• 2 Posts
  • 340 Comments
Joined 1 year ago
Cake day: June 1st, 2023

  • Epic have come a long way from Epic MegaGames, and it isn’t always a fairytale story I suppose.

    Someone here on Lemmy highlighted that quite nicely when Valve dropped their Half Life documentary. Valve embraces their past. They cherish it. They still maintain their old games to honor their success.

    Epic on the other hand completely wiped old Unreal titles from the relevant stores and don’t give a fuck about supporting any of them. Which is a shame. I also admire the tech behind modern Unreal engines, so there are still geniuses at work who are likely passionate. Too bad they essentially only ride the Fortnite train outside their engine development.




  • aksdb@feddit.de to Memes@lemmy.ml · Yeah, well...
    6 months ago

    That is - IMO - what critical thinking is meant to be … thinking about alternative explanations and evaluating their viability or probability.

    Unfortunately a lot of people use the term “critical thinking” as just another way to rationalize why they are against something, without actually weighing the options.





  • I would consider Todd Howard to be part of development (since he directs the creative and narrative angle, from what I understand).

    He defended bad performance with “get better hardware”. He defended criticism of the content with “you play the game wrong”.

    Both are bullshit “excuses”. The first one was even debunked by modders who showed that there was potential for optimization. And modders are far more limited than engine devs. The game doesn’t look ugly, but there are far better looking games with more scene complexity out there that run better.

    And “you play it wrong” is bullshit because if enough people play it wrong to have an effect on the rating of the game, then the game is badly designed. Part of game design is making sure the game explains itself or subtly pulls players in the right direction. Either they failed at that, or there simply is no clear direction. But that’s not the players’ fault.








  • As with every software/product: they have different features.

    ZFS is not really hip. It’s pretty old. But also pretty solid. Unfortunately it’s licensed in a way that is maybe incompatible with the GPL, so no one wants to take the risk of trying to get it into Linux. So in the Linux world it is always a third-party add-on. In the BSD or Solaris world though …

    btrfs has similar goals to ZFS (more on that soon) but has been developed right inside the kernel all along, so it typically works out of the box. It has a bit of a complicated history with its stability/reliability, from which it still suffers (the history, not the stability). Many/most people run it with zero problems, some will still cite problems they had in the past, and some apparently still have problems.

    bcachefs is also looming around the corner and might tackle problems differently, bringing us all the nice features with fewer bugs (optimism, yay). But it’s an even younger FS than btrfs, so only time will tell.

    ext4 is an iteration on ext3 on ext2. So it’s pretty fucking stable and heavily battle tested.

    Now why even care? ZFS, btrfs and bcachefs are filesystems following the COW philosophy (copy on write), meaning you might lose a bit of performance but win on reliability. It also makes snapshots easy to enable, which all three offer out of the box. So you can basically say “mark the current state of the filesystem with tag/label/whatever ‘x’”, and every subsequent change (since changes are copies) will not touch the old snapshot, allowing you to easily roll back a whole partition. (Of course that takes up space, but only incrementally.)

    They also bring native support for different RAID levels making additional layers like mdadm unnecessary. In case of ZFS and bcachefs, you also have native encryption, making LUKS obsolete.

    For typical desktop use, ext4 is totally fine. That said, snapshots are extremely convenient if something breaks, since you can basically revert the changes with a single command. They don’t replace a backup strategy, though, so in the end you should have some data security measures in place anyway.

    *Edit: forgot a word.






  • I don’t blame the engine. There are other studios out there with custom engines that evolved over time. The Creation Engine has also evolved a lot.

    That they work with many connected scenes instead of a continuous world also has advantages … it allows them to easily change the “world” between scenes by simply linking you “back” to a different scene (for example, a city that is now under siege but wasn’t before the dialog). It’s how they work. They could do the same shit with Unreal if they wanted to, and if they believed this kind of game design is the only one feasible for their storytelling, they would shove it into another engine as well.

    I also don’t think the game feels “old”. I do think it feels conceptually unfinished. They had many ideas and you can see a lot of different systems in the game (space fights, planets with different biomes, ship building, base building, and so on and so forth). Each of these systems in itself has some kind of concept, but taken together they are missing a clear concept, IMO.

    From what I know, game dev typically works in modules that get thrown together, and that also seems to be the case here. However, the “big picture” wasn’t refined, or they realized it would need a ton of small adjustments all over the place (conceptually AND technically) to make sense of it, and it looks like they were not able to deal with that complexity.

    As a result we have a game that is okayish. It tells some stories and offers a lot of content, but it doesn’t feel nearly as stunning as it should, and it isn’t groundbreaking on a single front.