

Compute power seems to be the same, disregarding any architecture changes. If they manage to pull off equal performance at a lower price and lower power envelope, it’ll actually be quite something.
ATI, sorry, meant AMD, has tried the low-price approach. It achieved nothing in terms of market share: the green fans didn’t find a lower price and better performance enough to be swayed, and AMD only ended up hurting its bottom line. There is nothing in it for them.
What has AMD to gain by dumping prices to soothe the Nvidia-first crowd?
Yeah… In fourth grade I was taught that there is nothing like an outer foe to create inner peace. I never imagined it to be the US to accomplish that, but here we are.
That was my first thought as well. Why don’t his kids want to live with him?
Well… That just might qualify as a hotel room:)
Jokes aside, this is one of the few applications of “car use” I’m OK with. It’s predominantly non-urban, it’s not intended for everyday use, and it’s for getting away on days off. If all commuting were done on collective, electric transportation, RVs would not be much of a concern.
Until everyone started free camping, clogging up the forests, that is…
I’m going out on a limb here. On average car drivers and cyclists are equally rubbish in traffic. After that it becomes a numbers game. I don’t see any reason why the mode of transportation has any bearing on my skills in traffic.
As I told my kids when they started venturing out in traffic by themselves:
Me: Expect anyone in traffic to be a moron.
Them (with a gotcha snicker): But that means you, too!
Me: Yes.
We all have bad days in traffic, regardless of how many wheels are at our disposal. Plan for it.
No, as a driver you are always responsible for your actions.
However, being a driver does not absolve cyclists of their responsibility as cyclists. In this case the issue is the lack of reflectors and bike lights. That is part of the responsibility that comes with being a biker.
Any reference to race, outside of the reflective properties of different colors that might actually be relevant in this case, are yours.
At a speed at which the driver has time to brake for whatever the driver may encounter on that road.
Wildlife is notoriously bad at wearing high-viz clothing, although I’ve heard the Finns are making progress on the issue.
I can’t decide if this is real, AI, or Cyberpunk 2077…
As I understand it, it’s about shifting the perspective from “everything is for cars and pedestrians are an afterthought” to “everything is for pedestrians except this particular piece of road where cars may drive”. From “car first” to “pedestrian first”.
I’m just at the beginning, but my plan is to use it to evaluate policy docs. There is so much context to keep up with, so any way to load more context into the analysis will be helpful. Learning how to add Excel information to the analysis will also be a big step forward.
I will have to check out Mistral:) So far Qwen2.5 14B has been the best at providing analysis of my test scenario. But I guess an even higher-parameter model will have its advantages.
Thank you! Very useful. I am, again, surprised how a better way of asking questions affects the answers almost as much as using a better model.
I need to look into flash attention! And if I understand you correctly, a larger llama3.1 model would be better prepared to handle a larger context window than a smaller llama3.1 model?
Thanks! I actually picked up the concept of a context window, and from there how to create a modelfile, through one of the links provided earlier, and it has made a huge difference. In your experience, would a small model like llama3.2 with a bigger context window be able to provide the same output as a big model, like qwen2.5:14b, with a more limited window? The bigger window obviously allows more data to be taken into account, but how does the model size compare?
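In case it helps anyone else reading along, this is roughly what my modelfile looks like. `num_ctx` is the Ollama parameter that sets the context window; the 16384 value is just what fits my hardware, so treat it as an assumption to adjust:

```
FROM qwen2.5:14b
PARAMETER num_ctx 16384
```

Then `ollama create qwen-bigctx -f Modelfile` (the name is my own invention) and run that model instead of the base one.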
Thank you for your detailed answer:) it’s 20 years and 2 kids since I last tried my hand at reading code, but I’m doing my best to catch up😊 Context window is a concept I picked up from your links which has provided me much help!
The problem I keep running into with that approach is that only the last page actually gets summarised, and some of the texts are… longer.
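What I’ve been experimenting with to get around that is summarising in chunks and then summarising the summaries (map-reduce style). A rough Python sketch against Ollama’s `/api/generate` endpoint; the 6000-character chunk size, the overlap, and the model name are just my assumptions, not anything official:

```python
import json
import urllib.request

def chunk_text(text, size=6000, overlap=500):
    """Split text into overlapping character chunks so nothing falls off the end."""
    step = max(1, size - overlap)  # guard against overlap >= size
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += step
    return chunks

def summarise(prompt, model="qwen2.5:14b", host="http://localhost:11434"):
    """One non-streaming call to a local Ollama server."""
    req = urllib.request.Request(
        host + "/api/generate",
        data=json.dumps({"model": model, "prompt": prompt, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

def summarise_document(text):
    """Map: summarise each chunk. Reduce: summarise the partial summaries."""
    partials = [summarise("Summarise this section:\n\n" + c) for c in chunk_text(text)]
    return summarise("Combine these partial summaries into one:\n\n" + "\n\n".join(partials))
```

No idea if it beats a properly set-up RAG, but at least every page gets read.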
Do you know of any nifty resources on how to create RAGs using ollama/webui? (Or even fine-tuning?) I’ve tried to set it up, but the documents provided don’t seem to be analysed properly.
I’m trying to get the LLM into reading/summarising a certain type of (wordy) files, and it seems the query prompt is limited to about 6k characters.
I couldn’t disagree more with you. If there are pedestrians nearby, you drive slowly and keep your distance, regardless of where you drive.
The same goes for pedestrians, though. Don’t walk where it’s not safe, for everyone’s safety. Like the interstate. It’s a shared responsibility.
This, however, is in the middle of a neighborhood where a ball and a kid could come flying at a moment’s notice…
That’s because you already have top-notch gear:) For everyone not already at 7900-level performance, 90% of the performance at 80% of the price is great.
For those that are able to spend 80% of the price, that is.