GPU Fuzzing Case Generation
It started out as a lame joke on Twitter in reply to @dakami’s post:
dakami: There is a peculiar point, working with any protocol, at which you go native in its hexadecimal representation. Then things get interesting.
adamcecc: @dakami so you’re a hacker and u want to go wee but you don’t have drugs yet? map each byte to different asci color and pipe to a xterm-256
Then I had a WAIT A SECOND moment.
I realized that if I’m mapping bytes to colors, I can also map those colors to textures. Those textures can be loaded into an OpenGL environment and…. wait for it….
Manipulated with the hardware-optimized pipeline of the GPU. We can transform the textures in interesting ways (shading, warping, lighting, raytracing), which turns the GPU into a super-high-speed “dumb” fuzzing case generator. With a bit of logic we can map different parts of a protocol to different vertices, swap the nodes around, and then render our fuzzed textures on top of the protocol matrix.
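To make the core idea concrete, here is a minimal CPU-side sketch of the bytes→texture→fuzzed-bytes round trip. This is not the actual OpenGL implementation — it just simulates a “shading” pass with NumPy: each byte becomes a grayscale RGB pixel, a brightness gain stands in for a shader, and the mutated pixels are read back out as the fuzz case. All function names here are illustrative, not from the real tool.

```python
import numpy as np

def bytes_to_texture(data: bytes, width: int = 16) -> np.ndarray:
    """Map each byte to an RGB pixel (grayscale), padded out to a rectangle."""
    buf = np.frombuffer(data, dtype=np.uint8)
    height = -(-len(buf) // width)  # ceiling division
    padded = np.zeros(width * height, dtype=np.uint8)
    padded[: len(buf)] = buf
    # Replicate each byte value into the R, G, and B channels.
    return np.repeat(padded.reshape(height, width, 1), 3, axis=2)

def shade(texture: np.ndarray, gain: float) -> np.ndarray:
    """A stand-in 'shading' pass: scale pixel intensities, clamped to 0..255."""
    return np.clip(texture.astype(np.float32) * gain, 0, 255).astype(np.uint8)

def texture_to_bytes(texture: np.ndarray, length: int) -> bytes:
    """Read the red channel back out as the fuzzed byte stream."""
    return texture[:, :, 0].reshape(-1)[:length].tobytes()

request = b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n"
tex = bytes_to_texture(request)
fuzzed = texture_to_bytes(shade(tex, 1.07), len(request))
```

On the GPU the `shade` step would be a fragment shader running over the whole texture in parallel, which is where the speed comes from; a gain of exactly 1.0 is the identity round trip.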
Node length alterations in the matrix = size fuzzing. Texture shading = data mutation. Rearranging nodes = swapping around parts of the protocol.
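The node operations above can be sketched without any GPU at all. Here the “protocol matrix” is just an ordered list of named nodes (my own hypothetical representation, not the tool’s actual data structure): swapping nodes rearranges protocol parts, and stretching a node’s payload is size fuzzing.

```python
# Hypothetical "protocol matrix": an ordered list of named nodes,
# each holding the raw bytes for one part of the protocol.
nodes = [
    ("method",  b"GET "),
    ("path",    b"/ "),
    ("version", b"HTTP/1.1\r\n"),
    ("host",    b"Host: example.com\r\n"),
    ("end",     b"\r\n"),
]

def swap_nodes(nodes, i, j):
    """Rearranging nodes = swapping around parts of the protocol."""
    out = list(nodes)
    out[i], out[j] = out[j], out[i]
    return out

def resize_node(nodes, i, factor):
    """Node length alteration = size fuzzing: stretch a node's payload."""
    name, data = nodes[i]
    out = list(nodes)
    out[i] = (name, data * factor)
    return out

def render(nodes) -> bytes:
    """Flatten the matrix back into the on-the-wire byte stream."""
    return b"".join(data for _, data in nodes)
```

In the real thing these operations would happen on the vertex grid before the fuzzed textures are rendered over it, so the GPU does the heavy lifting on both axes at once.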
(All that coming soon)
This is a raw HTTP get rendered as a raw texture onto a vector in OpenGL.
This is one iteration of the same texture rendered with a very small alteration.
Quite the difference, no? This is a very extreme case of data manipulation, but you can easily imagine the same idea mapped onto a much larger polygon grid. I need to build an interface so Peach can access this, then clean up the code, but I’ll be releasing it as soon as time permits.