Optimizing RigelEngine’s OpenGL Rendering for the Raspberry Pi (lethalguitar.wordpress.com)
40 points by ingve on April 22, 2021 | hide | past | favorite | 7 comments


With such a low resolution, I'm wondering if it wouldn't be faster and simpler to just do the whole rendering on the CPU and just blit and upscale the result with the GPU.
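A rough sketch of that approach (my own illustration, not from the article, and it needs a live GL context plus a fullscreen-quad pipeline to actually run; `fbTexture`, `pixels`, and `drawFullscreenQuad` are hypothetical names):

```c
/* Sketch: render the whole 320x200 frame on the CPU, then upload it
 * once per frame and let the GPU do the upscaling. */
GLuint fbTexture;            /* created once with glGenTextures/glTexImage2D */
uint32_t pixels[320 * 200];  /* CPU-side RGBA framebuffer */

void presentFrame(void) {
    glBindTexture(GL_TEXTURE_2D, fbTexture);
    /* Replace the texture contents with this frame's CPU output.
     * GL_NEAREST filtering on the texture keeps the chunky pixel look. */
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 320, 200,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    drawFullscreenQuad();  /* hypothetical helper: one textured quad */
}
```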


The bit about optimizing his blit shader surprised me (as a 20+ year graphics dev): the few extra multiply-adds should be virtually free on most architectures, since a shader that small would generally be ROP-limited.


Oh, and yeah, the remaining glBufferData would definitely be on my fix-next list. Unless the tilemap changes dynamically, all those tile triangles should be grouped into chunks in one "vertex atlas", with each chunk's screen position controlled by a single uniform.
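To illustrate what I mean (a GLES 2.0-style sketch, not RigelEngine's actual shader; the attribute/uniform names are made up): the tile geometry is uploaded once in chunk-local coordinates, and only the offset changes per draw.

```glsl
// Static tile vertices live in one VBO, grouped by chunk.
// Per frame, per chunk: set chunkOffset, then glDrawArrays over
// that chunk's vertex range. No vertex re-upload ever.
attribute vec2 inPosition;   // tile-local vertex position
attribute vec2 inTexCoord;   // into the tile texture atlas
uniform vec2 chunkOffset;    // set once per chunk via glUniform2f
varying vec2 texCoord;

void main() {
    texCoord = inTexCoord;
    gl_Position = vec4(inPosition + chunkOffset, 0.0, 1.0);
}
```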


I suspect his bottleneck is glBufferData and he should look into buffer orphaning.
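For reference, orphaning looks like this (a sketch assuming `vbo`, `dataSize`, and `vertexData` exist; it needs a GL context to run). Re-specifying the store with NULL lets the driver hand back a fresh allocation instead of stalling until the GPU finishes reading the old one:

```c
glBindBuffer(GL_ARRAY_BUFFER, vbo);
/* Orphan the old storage: same size/usage, NULL data pointer. */
glBufferData(GL_ARRAY_BUFFER, dataSize, NULL, GL_STREAM_DRAW);
/* Fill the fresh allocation without synchronizing against in-flight draws. */
glBufferSubData(GL_ARRAY_BUFFER, 0, dataSize, vertexData);
```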


Or look into uploading all the data at the start of the frame and using glBindBufferRange.
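Roughly like this (my sketch, not the article's code; note glBindBufferRange needs GLES 3.0 or desktop GL, and it binds indexed targets such as uniform buffers, so offsets must respect GL_UNIFORM_BUFFER_OFFSET_ALIGNMENT; `ubo`, `drawChunk`, and the size variables are hypothetical):

```c
/* Upload all per-draw uniform data in one shot at frame start... */
glBindBuffer(GL_UNIFORM_BUFFER, ubo);
glBufferData(GL_UNIFORM_BUFFER, frameSize, allDrawData, GL_STREAM_DRAW);

/* ...then bind a sub-range per draw call instead of re-uploading. */
for (int i = 0; i < drawCount; ++i) {
    glBindBufferRange(GL_UNIFORM_BUFFER, 0, ubo,
                      i * alignedBlockSize, blockSize);
    drawChunk(i);  /* hypothetical per-chunk draw */
}
```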


The name "RigelEngine" is nice in itself, but the headline does not reveal what this actually is: it's a reimplementation of the code for Duke Nukem II (using existing data files).


>I thought rendering it in 1080p could allow for future high-res mods, but it ran too slowly on the Raspberry Pi, so I switched to the original resolution instead.

sums up the whole article.




