Unity on Graphics Downgrades: Setting Expectations Too High Can Backfire; Don’t Develop on a High-End PC
Ever since the infamous Killzone 2 E3 demo, graphics downgrades have become an increasingly popular topic in the gaming community. The pattern is familiar: a game is revealed with a trailer showing stunning graphics, and then, as the release date draws near, new trailers and images show noticeably lower graphical quality.
This happened to several high-profile releases such as Watch_Dogs, Dark Souls 2, The Witcher III, The Division, Assassin’s Creed Unity, Far Cry 3/4, Forza Motorsport 5, Motorstorm and more (though in my opinion, most of these games still looked fantastic at maximum settings on PC). The phenomenon generally occurs because developers try their best to impress the public without fully accounting for the requirements of the final game and the constraints of the target hardware.
Mathieu Muller, Field Engineer at Unity Technologies, speaking to DSO Gaming, offered game developers everywhere some advice on graphics downgrades.
Setting expectations too high can backfire. On the other hand, trying to show the best of your technology is always interesting, because this is how you push the limits and differentiate from others. Another factor to take into account is that shipping a game can sometimes be almost the hardest thing on earth, and almost everyone has to cut a lot of things in the last months of development. Experienced studios have developed tools and methodologies to reduce that risk. For indies, things can go wild, if they have not done unit testing, continuous integration, tools to easily analyse and reproduce problems. And the Number One rule is: do not develop your game on a high-end PC!
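The testing and continuous-integration practices Muller mentions can be made concrete with a small automated performance check. The sketch below is purely illustrative (the frame timings, budget, and function names are all hypothetical, not from Unity or the article): a CI job could run an automated playthrough on minimum-spec hardware, record frame times, and fail the build if they drift over budget, which is exactly the kind of early warning that prevents a late, visible downgrade.

```python
import statistics

def frame_times_ms():
    """Stand-in for frame timings captured from an automated playthrough.

    A real pipeline would record these on the weakest supported hardware,
    not on a high-end development PC.
    """
    return [14.2, 15.1, 16.0, 15.5, 14.8, 16.4, 15.9, 15.2]

def within_budget(times, budget_ms=16.7, spike_factor=1.25):
    """Pass only if the average frame time stays under budget and no
    single frame spikes far beyond it (16.7 ms is roughly a 60 fps target).
    """
    return (statistics.mean(times) < budget_ms
            and max(times) < budget_ms * spike_factor)

print(within_budget(frame_times_ms()))
```

Run as part of every build, a check like this turns "the game got slower" from a launch-window surprise into a failed test pointing at the commit that caused it.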
Those are definitely sensible recommendations. After all, there is nothing to gain from setting expectations too high or developing on high-end PCs, and a lot to lose, as most developers will hopefully have learned by now.