OpenGL is deprecated in macOS 10.14 Mojave, according to Apple's latest announcement.
Developers will still be able to use OpenGL in macOS, but this makes it official that Apple will no longer update or extend it. Many games and apps use OpenGL, so developers will have to adapt to the new Metal and Metal Performance Shaders. In my opinion, developing for the Mac now requires an attitude of evolve or perish. I suspect that for many indie developers, Mac support will no longer be worthwhile, especially when the platform is so difficult and unfriendly to inexperienced users.
I would be interested to see what you all have to contribute on this matter, as fellow developers.
The deprecation of OpenGL from macOS
Forum rules
Ren'Py specific questions should be posted in the Ren'Py Questions and Announcements forum, not here.
- Imperf3kt
- Lemma-Class Veteran
- Posts: 3791
- Joined: Mon Dec 14, 2015 5:05 am
- itch: Imperf3kt
- Location: Your monitor
- Contact:
Re: The deprecation of OpenGL from macOS
I doubt this affects Ren'Py, as Ren'Py uses OpenGL ES 2.0, which is not the same as OpenGL.
Warning: May contain trace amounts of gratuitous plot.
pro·gram·mer (noun) An organism capable of converting caffeine into code.
Current project: GGD Mentor
Twitter
- Fuseblower
- Regular
- Posts: 189
- Joined: Sat Jan 16, 2016 6:01 pm
- Projects: Mall Macabre, Slushball Slasher, Doomed Diner, Tenkeiteki Tokyo
- itch: fuseblower
- Location: Netherlands
- Contact:
Re: The deprecation of OpenGL from macOS
I think most developers won't notice much of it, since they'll be using a cross-platform development tool (like Ren'Py). Of course, Apple does everything in its power to make cross-developing for its own platform hard. That hasn't changed. But we still have HTML5.
Anyway, graphics functions should always be wrapped in a class or some other kind of interface so that any change or addition in the underlying graphics hardware or API only results in a need to change that wrapper code and nothing else.
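That wrapper idea can be sketched in a few lines of Python. This is a minimal illustration, not Ren'Py's actual architecture; the class and method names (Renderer, draw_sprite, make_renderer) are hypothetical, and the backends just return strings standing in for real OpenGL or Metal calls.

```python
from abc import ABC, abstractmethod

class Renderer(ABC):
    """Interface the rest of the game codes against. Only this
    hierarchy needs to change when the underlying API changes."""
    @abstractmethod
    def draw_sprite(self, name: str, x: int, y: int) -> str: ...

class OpenGLRenderer(Renderer):
    def draw_sprite(self, name, x, y):
        # Real code would issue OpenGL calls here.
        return f"GL: drew {name} at ({x}, {y})"

class MetalRenderer(Renderer):
    def draw_sprite(self, name, x, y):
        # Real code would issue Metal calls here.
        return f"Metal: drew {name} at ({x}, {y})"

def make_renderer(platform: str) -> Renderer:
    # Backend is chosen once, at startup; game code never asks again.
    return MetalRenderer() if platform == "darwin" else OpenGLRenderer()

if __name__ == "__main__":
    renderer = make_renderer("darwin")
    print(renderer.draw_sprite("hero", 10, 20))
```

Game code only ever holds a `Renderer`, so dropping OpenGL for Metal means adding one subclass and changing one line in the factory.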
In the old days we had to write for different standards like VGA, CGA, EGA, Hercules, etc. And to get any kind of hardware acceleration (if there was any at all), we had to write for specific graphics cards (like writing code for a Matrox Millennium, for example).
Worse: the graphic assets themselves had to be made specifically for each of these graphics adapters, because the resolutions and the colors were wildly different. Even on home computers like the Atari ST there was the problem that some people used a black-and-white monitor giving 640x400 (in black and white, of course), but those people couldn't use the glorious 16-color 320x200, and vice versa, because the color monitors couldn't do the high resolution of 640x400. Many had two monitors because of this (the color SC1224 and the monochrome SM124 or SM125). The Commodore Amiga didn't have this problem and offered a higher resolution (by cutting the speed of the CPU in half!).
But enough of those old horror stories. My beef with Apple is not that they do this sort of thing; Microsoft is no stranger to such tactics (even with those rare APIs it developed itself, like ODBC). My beef with Apple is that they're unfriendly to developers in general. You don't see Microsoft demanding money from developers for the "privilege" of developing for their OS. I'm surprised Apple still has a market share at all.