AltME: R3 Building and Porting


I guess it's the same on iOS... the "draw" API is also not using the GPU.
The HW acceleration is currently mostly used only for moving textured bitmaps... a kind of fast blitter for the 2D graphics area.
Of course, if you want to create a classic 3D game from textured polygons, shaders, etc., that's the area where OpenGL excels (and where its development was mainly focused).
But the good news is that some people are experimenting with shaders to create 2D-oriented engines; still, until the main players on the market settle on some standard, it can take a lot of time.
MaxV, I made a couple of tiny edits to clarify. I hope that's OK.
Couldn't DRAW work the same way as it does now, but render to a bitmap?  Then that bitmap could be rendered by OpenGL like any other bitmap.
Yes, that's actually the method I'd like to use to upgrade the View engine... and not only for DRAW. The code would be modular, so you could render to textures using any other library if you want.
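The render-to-bitmap idea above can be sketched roughly as follows. This is a hypothetical illustration, not the actual View engine code: all names (`render_draw_block`, `fill_box`, the command format) are invented for the example. DRAW-like commands are rasterized in software into a flat RGBA buffer, and a HW back end would then treat that buffer as texture data (e.g. upload it with `glTexImage2D` and blit it as a quad).

```python
# Hypothetical sketch: rasterize DRAW-like commands into an RGBA bitmap
# on the CPU; a HW compositor would then upload this buffer as a texture
# and move it around. Rasterizer and compositor stay independent.

WIDTH, HEIGHT = 64, 64

def make_bitmap(w, h):
    """Flat RGBA buffer, row-major, initialized to transparent black."""
    return bytearray(w * h * 4)

def fill_box(bitmap, w, x0, y0, x1, y1, rgba):
    """Minimal stand-in for a DRAW 'box' command, rasterized in software."""
    for y in range(y0, y1):
        for x in range(x0, x1):
            off = (y * w + x) * 4
            bitmap[off:off + 4] = bytes(rgba)

def render_draw_block(commands, w=WIDTH, h=HEIGHT):
    """Render a list of (command, args) pairs to a bitmap.

    The returned buffer is exactly what a GPU back end would consume as
    texture data, regardless of which library produced it."""
    bmp = make_bitmap(w, h)
    for cmd, args in commands:
        if cmd == 'box':
            fill_box(bmp, w, *args)
    return bmp

bmp = render_draw_block([('box', (8, 8, 24, 24, (255, 0, 0, 255)))])
print(bmp[(8 * WIDTH + 8) * 4])   # red channel inside the box: 255
print(bmp[0])                     # pixel outside the box: still 0
```

The point of the split is the one made in the message above: any rasterizer (AGG or otherwise) can fill the buffer, and any compositor (OpenGL or otherwise) can display it.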
I believe the Android porting effort will show us the optimal solution. It is good to find a balance between a highly optimized but less portable HW engine and something that is fast enough and can be ported without much pain.
Just think, though: if OpenGL were the default renderer for graphics in R3, you could create a flat surface for a 2D screen, then during the same session create another flat surface for a second 2D screen at a 90-degree Z-axis orientation to the first, and rotate from the first 2D screen to the second in a fluid 3D way.
I did quite a few tests with OpenGL on R2, but the killer for me was the inherent delay of R2 calling the methods (functions) in the OpenGL DLL.  I was still able to rotate an object of over 1000 coordinate points 30 times per second on a quad-core computer, but that's probably 1000 times slower than you could do it in straight C on the same machine.
I toyed with the idea of writing a C-based DLL that could take all the information for rendering an entire data structure from R2, so R2 would only have to make one method call to the DLL per frame instead of thousands, but I couldn't get enough higher-priority items off my list to get started on it.
I'm not saying you should use OpenGL for DRAW. There is no drawing API in Stage3D either. I was proposing to use AGG to draw to a bitmap with the best possible quality, and let the HW do what it's made for: moving pixels around. Bo understands it. That's actually how modern apps are built to hit 60fps. Doing animations purely in AGG will not work (and honestly never has). But maybe I'm too involved in animations, and for many people static content would be enough. That is not the future, though.
But of course, it's not an easy task. One must be careful to get it right: all the draw-command batching, texture packing, etc.
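Of the two tasks named above, texture packing is the easier one to show compactly. The sketch below is a naive "shelf" packer, assumed purely for illustration (the function name and strategy are not from the R3 code): many small bitmaps are placed into one large atlas so the compositor can batch them under a single texture bind.

```python
# Hypothetical texture-packing sketch: pack (w, h) rectangles into one
# atlas using naive shelves -- left to right on the current shelf, and a
# new shelf below when the row is full. Real packers are smarter, but
# the goal is the same: fewer texture binds, better draw-call batching.

def pack_shelves(sizes, atlas_w):
    """Return (placements, atlas_height), placements[i] = (x, y) of sizes[i]."""
    placements, x, y, shelf_h = [], 0, 0, 0
    for w, h in sizes:
        if x + w > atlas_w:           # shelf full: start a new one below
            y += shelf_h
            x, shelf_h = 0, 0
        placements.append((x, y))
        x += w
        shelf_h = max(shelf_h, h)     # shelf is as tall as its tallest sprite
    return placements, y + shelf_h

sprites = [(64, 32), (64, 32), (128, 64), (32, 32)]
spots, height = pack_shelves(sprites, atlas_w=128)
print(spots, height)   # -> [(0, 0), (64, 0), (0, 32), (0, 96)] 128
```

Draw-command batching then works hand in hand with this: commands that sample the same atlas can be merged into one draw call.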
Oldes, I don't argue with you and Bo about that. I think we all know the state of this technology. I've already done several prototypes of such an "engine", so I have some ideas about how this could be done for R3; it's just a matter of prioritizing, time, and resources.
I wrote about the drawing APIs just so other people know OpenGL is no Messiah if you want to do high-quality 2D vector graphics in realtime.
I'm not against HW acceleration at all. It's just not an easy topic in this non-ideal programming world, as you pointed out.
I see the solution as a good high-quality rasterizer plus a HW-accelerated compositing engine. That should be a flexible setup, at least IMHO. Plus, this way we also get the classic 3D API for free.
Bo, I even tried to HW-accelerate the AGG renderer code so it integrates with OpenGL, and you can use DRAW directly inside the OpenGL context, mixed with 2D surfaces or 3D objects... lots of fun. But still, a lot of stuff is computed on the CPU that way. Nevertheless, it's still better than a fully SW-based renderer.
The best solution for today's graphics HW would be to rewrite most of the AGG code for the GPU using shaders. That would be a state-of-the-art 2D engine for the future. But it's also a pretty big task ;)
What do you think is the best roadmap for the graphics engine in R3 right now?  Simply port VID to R3 to start, and then in R3v2 swap out the graphics engine for hardware-based code?
There are plenty of possibilities here.
Either port VID and deal with its flaws and its history,
or go the path of the RebGUI
or redo VID
I have read somewhere that Carl expected someone to come up with something better than VID.
I like VID, yet it has its oddities, like when positioning elements using 'at. Some of its behaviors could be improved; if you port it, you may be hindered by this aspect, and it may end up harder than restarting with a restricted base set of widgets.
We are probably off-topic here.
I don't think we're really off-topic, as the graphics stuff pertains to porting to different platforms, but if you wish, we could move this to the View/VID group.
When Carl was developing VID, he clearly did not expect VID to become the de-facto standard for Rebol graphics.  The face engine was the de-facto standard, and VID was simply one of what he expected to be several dialects for the face engine.  A few others, like GLASS, did come about.
Bo, I think if we don't make drastic changes to the GOB mechanism, we should be safe building anything on top of the gob! datatype. The gob! is in fact an abstraction layer between the "View engine" and any "GUI framework" written in REBOL.
So take this example:
We now have the R3GUI framework, which runs quite well on the current View engine (although this engine was built in 2 weeks during the very early R3 alpha work, so it's kind of a Q&D prototype ;))
(BTW should I mention the R3GUI is much better than R2 VID?)
Anyway, R3GUI works on the current View engine. When I tried to change the engine to use OpenGL-accelerated AGG, R3GUI still worked without any problems (except visual bugs caused by the incomplete OpenGL implementation in the new prototype).
So from that example you can see the "View engine" and the "GUI framework" are two independent things and can be developed or even exchanged separately.
