Optimize latency while successively displaying images from a massive image database


Feb 13, 2013
I am working on a beta version of a speed reading application. It's called Etopia Ebook Reader. What it does is combine the concept of an ebook reader and a visual dictionary into one convenient speed reading and studying tool.

Please watch this YouTube video to see what it does, so I don't have to spend a huge amount of time explaining the concept: Etopia Ebook Reader pre beta presentation - YouTube

The following code runs inside a loop that loads one image at a time and draws it onto the page bitmap as a new flash card. Each time the loop moves on to the next increment, a new flash card is loaded and then drawn at a new location.
mycurrentjpgfilestring = installpath & "image_data\" & currentword & ".jpg"
Using sourceimage As Image = Image.FromFile(mycurrentjpgfilestring) 'dispose the file-backed Image once the resized copy exists, or it leaks and keeps the file locked
    currentflashcard = New Bitmap(sourceimage, 200, 140)
End Using
Using Graph As Graphics = Graphics.FromImage(mypagebitmap) 'mypagebitmap is a blank white bitmap sized to fit the picturebox, and is where flashcards are drawn
    Graph.DrawImage(currentflashcard, currentxlocation, currentylocation)
End Using 'Graphics objects wrap a GDI+ handle and should be disposed after each draw
PictureBox1.Image = mypagebitmap

The philosophy behind that snippet is that I can't fit the entire 10 GB of images into RAM before rendering the PictureBox, so it simply loads one image at a time and draws it as it goes, and therefore never runs out of memory.

However, I would like to use as much RAM as is available to preload image contents, so that the PictureBox can be updated straight from a Bitmap already in memory instead of by reading the .jpg files from the hard disk. I can't decide how to do this, because the database is 10 GB and I have 6 GB of RAM, so to prebuffer images I would have to choose between two plans:

A) Load only some of my images into RAM, selected by their word's percent occurrence in the English language, and read the rest from disk on demand.

B) Preload the first x bytes of every image file in my database into an array of byte arrays. When it comes time to display an image, combine the prebuffered bytes with the leftover bytes read from that file on the hard drive, starting at the index where the prebuffering left off. This forces part of every image file into RAM, which shrinks the amount left to read from disk and therefore reduces disk access time.

Plan A, however logical it may seem, will take a long time to decide which words are worthy of being loaded into RAM immediately.
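To make Plan A concrete, here is a minimal sketch of a frequency-based cache: cache the flash cards for the most common words up to a fixed budget, and fall back to loading from disk for everything else. The names `wordfrequencies`, `flashcardcache`, and `maxcached` are assumptions for illustration, not names from the actual project.

```vbnet
Imports System.Linq

' Hypothetical Plan A sketch: Bitmap cache keyed by word, filled from a frequency list.
Private flashcardcache As New Dictionary(Of String, Bitmap)

Sub PreloadFrequentWords(wordfrequencies As IEnumerable(Of KeyValuePair(Of String, Double)),
                         maxcached As Integer)
    'Take the most common words first, up to a memory budget expressed as a card count.
    For Each pair In wordfrequencies.OrderByDescending(Function(p) p.Value).Take(maxcached)
        Dim jpgpath As String = installpath & "image_data\" & pair.Key & ".jpg"
        Using source As Image = Image.FromFile(jpgpath)
            flashcardcache(pair.Key) = New Bitmap(source, 200, 140)
        End Using
    Next
End Sub

Function GetFlashcard(word As String) As Bitmap
    Dim card As Bitmap = Nothing
    If flashcardcache.TryGetValue(word, card) Then Return card 'cache hit: no disk access
    'Cache miss: load from disk exactly as the current code does.
    Using source As Image = Image.FromFile(installpath & "image_data\" & word & ".jpg")
        Return New Bitmap(source, 200, 140)
    End Using
End Function
```

The drawing loop would then call `GetFlashcard(currentword)` instead of `Image.FromFile`, so frequent words cost no disk access at all.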

So, what do you think, should I go with plan A or plan B? And if you say plan B, could you give some insight into how I would store the prebuffered bytes, how I would combine them with the post-buffering bytes, and finally how to convert that total set of bytes into a Bitmap?
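For the byte-combining step of Plan B, here is a minimal sketch under the stated assumptions: a `Dictionary(Of String, Byte())` named `prebuffered` holds each file's first `prefixlength` bytes, and the rest is read from disk and the two halves are stitched into one array that a `MemoryStream` can decode. All names here are hypothetical.

```vbnet
Imports System.IO

' Hypothetical Plan B sketch: first prefixlength bytes of each file live in RAM.
Private prebuffered As New Dictionary(Of String, Byte())

Function LoadFlashcard(filepath As String, word As String) As Bitmap
    Dim prefix As Byte() = prebuffered(word)
    Dim filelength As Integer = CInt(New FileInfo(filepath).Length)
    Dim whole(filelength - 1) As Byte
    'Start the full buffer with the bytes already sitting in RAM.
    Buffer.BlockCopy(prefix, 0, whole, 0, prefix.Length)
    'Read only the bytes the prebuffering didn't cover, starting where it left off.
    Using fs As New FileStream(filepath, FileMode.Open, FileAccess.Read)
        fs.Seek(prefix.Length, SeekOrigin.Begin)
        fs.Read(whole, prefix.Length, whole.Length - prefix.Length)
    End Using
    'A MemoryStream over the combined bytes decodes straight to an Image. GDI+ requires
    'the stream to stay open for the Image's lifetime, so copy into a fresh Bitmap
    'before the stream is disposed.
    Using ms As New MemoryStream(whole)
        Using decoded As Image = Image.FromStream(ms)
            Return New Bitmap(decoded, 200, 140)
        End Using
    End Using
End Function
```

One caveat on the design: the disk seek still happens on every display, so Plan B saves transfer time but not seek time, which is worth measuring before committing to it.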