I even heard people being surprised it’s not Geralt. When they were surprised, I started to question whether I had just dreamed that they announced it wayyyy back.
Or preferably: don’t care about the game at all until it releases. Ignore previews, alpha demos, beta footage, gameplay trailers/teasers, etc. That way you don’t build up hype that has a good chance of disappointing you. Take the game for what it is at release and either like it then or not.
Ah, good to know. Thanks!
There is some documentation in the forum about how to add new device support or where to request it. Read it and then decide what to do.
Even CD Projekt Red added such shit. Instead of directly launching The Witcher or Cyberpunk, I now have to go through a(nother) launcher. Pointless.
Baldur’s Gate 3 needed one from the beginning as well.
I don’t get it.
What ZigBee coordinator do you use? I know deCONZ and zigbee2mqtt have the ability to add support for new devices via config files. But that’s a bit of a rabbit hole into the ZigBee protocol. They also have forums/issue trackers where one can request support for new devices.
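For zigbee2mqtt specifically, that usually means writing an “external converter” file and referencing it from configuration.yaml. Here’s a minimal sketch of what such a file roughly looks like, assuming a hypothetical simple on/off device — the model/vendor strings are placeholders, and the exact import paths and converter API differ between zigbee2mqtt versions, so check their docs:

```js
// my_plug.js — hypothetical device, referenced via "external_converters" in configuration.yaml
const fz = require('zigbee-herdsman-converters/converters/fromZigbee');
const tz = require('zigbee-herdsman-converters/converters/toZigbee');
const exposes = require('zigbee-herdsman-converters/lib/exposes');
const e = exposes.presets;

const definition = {
  // must match the model string the device reports while pairing
  zigbeeModel: ['PLUG.MODEL.001'],
  model: 'PLUG.MODEL.001',
  vendor: 'SomeVendor',
  description: 'Hypothetical on/off smart plug',
  // reuse the generic on/off converters that ship with zigbee-herdsman-converters
  fromZigbee: [fz.on_off],
  toZigbee: [tz.on_off],
  exposes: [e.switch()],
};

module.exports = definition;
```

The rabbit hole starts when the device uses manufacturer-specific clusters; then you end up writing your own fromZigbee/toZigbee converters instead of reusing the generic ones.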
Stalwart is 95% awesome. What holds me back is that mails are stored in a database and not in Maildir. Maildir is insanely trivial to back up incrementally and to restore individual mails if necessary. That currently keeps me on Dovecot.
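The reason it’s that trivial: in Maildir every message is essentially its own file, so an incremental backup boils down to “copy the files you don’t have yet”, and restoring one mail is copying one file back. A toy sketch of that idea (paths are made up, flag renames in cur are ignored; in practice plain rsync does the same job):

```js
// Naive incremental Maildir backup: every mail is one file, so we only copy
// files that are not in the backup yet. Paths are placeholders.
const fs = require('fs');
const path = require('path');

const SRC = '/var/mail/user/Maildir/cur';
const DST = '/backup/Maildir/cur';

fs.mkdirSync(DST, { recursive: true });

let copied = 0;
for (const name of fs.readdirSync(SRC)) {
  const target = path.join(DST, name);
  if (!fs.existsSync(target)) {
    fs.copyFileSync(path.join(SRC, name), target);
    copied++;
  }
}
console.log(`copied ${copied} new mails`);
```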
But that’s the neat thing: the system is well structured into different layers and subcomponents. They are not all involved in controlling lightbulbs; that’s mostly your local Hue bridge. One component makes sure Alexa can control your bulbs (if you want that). If that component fails, only Alexa stops working. Another component handles push notifications to your mobile devices. If that fails, the rest is unimpacted. And so on.
That was, for a long time, the main reason I heavily recommended Hue: the bridge could be used completely offline and still offered a good local API and pairing system. Unfortunately, last year they made online accounts a requirement. I assume that, besides the app, you can still use many things even if your network connection is broken, though.
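For reference, that local API is plain HTTP on your LAN, no cloud involved. A rough sketch, assuming the classic v1 API, a bridge at 192.168.1.2 and an already-created application key (both placeholders; the newer CLIP v2 API works differently):

```js
// Toggle a light via the Hue bridge's local (v1) HTTP API.
// BRIDGE_IP and APP_KEY are placeholders; the key is created once by pressing
// the bridge's link button and registering an app via POST to /api.
const BRIDGE_IP = '192.168.1.2';
const APP_KEY = 'your-app-key';

async function setLight(id, on) {
  const res = await fetch(`http://${BRIDGE_IP}/api/${APP_KEY}/lights/${id}/state`, {
    method: 'PUT',
    body: JSON.stringify({ on }),
  });
  console.log(await res.json()); // the bridge answers with a success/error array
}

setLight(1, true);
```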
It’s more comparable to Snikket. Both Snikket and Prose use Prosody as the server, with their own extensions.
You could look into Prose. The interface of Slack/Discord/Mattermost, built on XMPP, with E2EE.
What’s wrong with that? Do you expect their backend to run off a single server with a little PHP script? The components seem pretty reasonable (with the actual business logic being just a small part).
Bitwarden’s local cache does not include attachments, though. If you rely on those, you have to rely on the server being available.
While I like and appreciate the campaign, the issue is IMO bigger. IoT devices, for example, even have an environmental impact when the services behind them get discontinued.
I would therefore like a more general rule: whenever a product is discontinued, for whatever reason, all necessary documents, sources, etc. need to be released to allow third parties to take over maintenance (that also includes schematics for hardware repairs).
Unreal Tournament
I wonder if that would be a genuine use case for “AI”. If the voice actor consents to having his voice represented in such a scene but doesn’t want to act it out in a studio, the computer model could take over that part.
It’s an okay game, but far worse than the first two. They forced an open world onto it and made it pretty repetitive. The DLC is more linear and feels a lot more like typical Mafia storytelling.
It doesn’t say that devs need to target home machines; it says they need to provide the resources so people can host it themselves, period.
Before attacking me with such an arrogant rant, maybe read what I wrote.
I said:
Once they release the source, people can refactor or reengineer it to run on a smaller scale, replace proprietary databases with free ones, etc.
So of course it’s about releasing anything (!) at all.
I simply said that you can’t compare a small fan project like a self-hosted WoW server with Blizzard’s infrastructure and the requirements of a highly available setup for millions of players.
ArenaNet is quite open about their infrastructure, and you can see that it is far from trivial, but it also allows them to do zero-downtime updates. That is a huge feat, but it also means that self-hosting that thing will be a pain in the ass. Yet I would not want them to skip all that just so it could be easily (!) self-hosted at some point in the distant future.
Such an architecture is typically shit: building a system that is simple AND scales that high won’t work. Complexity usually gets added to cope with scale. If we don’t allow companies to build scalable (i.e. complex) systems, we simply won’t get such games anymore.
Again: I am completely in favor of forcing devs to release everything necessary to host it. I am not in favor of forcing devs to target home machines for their servers, when their servers clearly have completely different requirements. That’s unrealistic.
Not a fair comparison. The private servers were written with small-scale hosting in mind. They would very likely never scale to what Blizzard has in place. For all I know, Blizzard could run their stuff on a mainframe with platform-specific optimizations against IBM Db2.
But I also don’t think this has to be transferable to a local setup without effort either. Once they release the source, people can refactor or reengineer it to run on a smaller scale, replace proprietary databases with free ones, etc.
The second one gives you the necessary flashbacks to catch up, should you intend to follow the story. It also explains all the basics of the game mechanics as part of the quests.