

If you like RPGs, I highly recommend giving it a try.
At 15€ or less, certainly; they can get fucked with 70€.
There’s no reason not to check it out!
I wouldn’t pay Microsoft for a subscription if it was the last way to play games left.
Windows 11 is shit and you’d be much happier with Fedora. You don’t need a new computer, just install it on this one. That said, if you force yourself to try sticking with Windows, you can move the start button to the left and even install https://github.com/Open-Shell/Open-Shell-Menu.
My “get fucked” goes to every hosted LLM. If it’s not running on my hardware, I’m not interested. I don’t want it to gaslight me according to someone’s instructions or shill for a product/service (which, you know, is coming). The difference is like waiting in line to speak to an expert at the counter vs. having said expert blindfolded and chained in your basement.
There is nothing stopping them from doing the same bullshit with firmware updates to the Kobos and DRM updates to the store and apps.
I never connect the Libra to any network, so how can they do anything? I did actually install some updates since there were a few annoying bugs, but I just downloaded the firmware on the PC from https://pgaskin.net/KoboStuff/kobofirmware.html and updated it offline. Now all those bugs seem fixed and the poor Kobo still hasn’t seen the interwebs.
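For anyone wanting to do the same, the offline flow looks roughly like this on Linux; treat it as a sketch, since the zip name and the mount point depend on your model and your system:

```bash
# Grab the firmware zip for your exact model from
# https://pgaskin.net/KoboStuff/kobofirmware.html, then, with the Kobo
# mounted over USB, extract the update straight into the hidden .kobo
# directory on the device (the mount point below is just an example):
unzip kobo-update-*.zip -d "/media/$USER/KOBOeReader/.kobo/"

# Safely eject the drive; the reader reboots and installs the update on its own.
```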
If it’s single player, then it being “a dead game” (i.e. not selling enough by whatever standards) is completely beside the point.
You put a few GPTs in a trenchcoat and they’re obviously AI. I can’t speak about OpenAI’s offerings since I won’t use it as a cloud service, but the local DeepSeek I’ve tried is certainly AI. People are moving the goalposts constantly, with what seems to me a determination to avoid seeing the future that’s already here. Download deepseek-coder-v2 16b if you have 16 GB of RAM and 10 GB of storage space and see for yourselves; the requirements are ridiculously low for what it can do. It uses 50% of four CPU cores for about 15 seconds to solve a problem with detailed reasoning steps.
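If you want to try it, ollama is the easiest route I know; something like this should do it (the 16b tag is what ollama’s library publishes at the time of writing, adjust if yours differs):

```bash
# One-time download of the quantized model (roughly 9-10 GB on disk):
ollama pull deepseek-coder-v2:16b

# Ask it something and watch it work through the problem:
ollama run deepseek-coder-v2:16b "Write a bash one-liner that lists the 10 largest files under /var"
```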
OK, it just spits out predicted tokens, but in answer to what you asked and sensitive to the context you provided, and its predictions are arranged such that, when you decode them into language, they present evidence or arguments used in thinking or argumentation. It also forms conclusions, draws inferences and produces results to problems, if you allow me to recycle a dictionary definition of “reasoning”. It’s not perfect, obviously you can’t cram a huge amount into a 16b distillation and it certainly can get things wrong, but you have to squint to not see reasoning when you ask it to guesstimate something or solve a mathematical problem. It is an LLM, but there’s reasoning coming out?
I use a 14b reduction of deepseek-r1 on my PC at home and it’s definitely not total bullshit. It’s 10 GB of local model that can solve mathematics and physics problems for you or program in Python or bash. It doesn’t hallucinate (or I haven’t been able to elicit it), and it’s aware of the extent of its knowledge. It works incredibly fast on an old Ryzen 1600 with a 6600 XT. Having an open-source reasoning AI that takes 10 GB of SSD and about 13 GB of RAM is so weird that the only thing weirder is seeing smart people dismiss it as bullshit out of hand.
That’s what he meant by “we’ll use sticks on the other side”.
It needs fresh dick hashes for up-to-date dick recognition
I use the 14b and it’s certainly great for my modest high school physics and Python needs (to help the kids), but for party games and such it’s a drag that its pop culture knowledge stops at mid-2023.
It’s actually hilarious when you zoom in on their test images and try to see how many more clothes the emperor has with them newfangled AI clothes.
Run it using ollama in a terminal (like ollama run model_name) and ask it a question.
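Concretely, assuming the 14b distill tag as published on ollama’s library, that’s just:

```bash
# Drops you into an interactive chat; Ctrl+D to exit.
ollama run deepseek-r1:14b

# Or one-shot a question straight from the shell:
ollama run deepseek-r1:14b "A ball is dropped from 20 m. How long until it hits the ground?"
```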
Jeeez, just copy ollama’s directory (something like .ollama) from the user’s dir to wherever. You can check and find the files inside. I find the published 14b really useful; it’s ten GB that thinks and reasons in English.
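On Linux the defaults look like this, so moving the model store is just a copy (paths assume a per-user install with no custom OLLAMA_MODELS setting):

```bash
# Downloaded model blobs and manifests live here by default:
ls ~/.ollama/models/

# Copy the whole directory to a backup drive or another machine:
cp -r ~/.ollama /path/to/backup/
```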
Two was horrible; the end boss skeleton is the stupidest shit. I liked the first, endured the second to the end, and never touched the third or Andromeda.
He was playing using a Neuralink prototype; that’s why he looked as if he was abusing ketamine.
It’s not even Lionel Richie.
Did you just all-black-people-look-the-same yourself?
It’s really a smart attack that’s not easily countered, and since the weapon is an open-source license, everyone benefits from the tech getting into the public domain. RL has shown there’s even less of a moat than we thought initially.