Q: Working with a large number of files - deadlock.

Rockfan

New member
Joined
Feb 7, 2007
Messages
4
Programming Experience
1-3
Hi all,

I'm somewhat in a pinch at the moment with one of my recent creations...

The general idea was to create a small, simple tool to convert files from one format to another. More specifically, large image files such as BMP to JPG.

After writing this, I took it for a test run with 30 files. Everything seemed to work OK. I then decided to test it with 300 files. At first it was working (sure, it took some time, but it worked), but after a while I got an exception from VS2005.

VB.NET:
ContextSwitchDeadlock
The CLR was unable to switch COM...

I did some research, and this turned out to be something that can be disabled in VS, which I did. On the next run I didn't get the exception, but the application crashed at around 80% of the images processed.

How am I handling this? (I will refrain from posting the entire code.)

Recursive scan of a folder (user input), store the paths of the files in an ArrayList, and use these paths as input for the converter function.
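
For reference, that approach sketches out roughly like this (module and method names are illustrative; the actual code was not posted):

VB.NET:
' Rough sketch of the approach described above - names are placeholders.
Imports System.Collections
Imports System.Drawing
Imports System.Drawing.Imaging
Imports System.IO

Module ConverterSketch
    ' Recursively collect all BMP paths under the given folder.
    Sub ScanFolder(ByVal folder As String, ByVal paths As ArrayList)
        For Each filePath As String In Directory.GetFiles(folder, "*.bmp")
            paths.Add(filePath)
        Next
        For Each subDir As String In Directory.GetDirectories(folder)
            ScanFolder(subDir, paths)
        Next
    End Sub

    ' Convert one BMP to a JPG next to the original.
    Sub ConvertToJpg(ByVal filePath As String)
        Using bmp As New Bitmap(filePath)
            bmp.Save(Path.ChangeExtension(filePath, ".jpg"), ImageFormat.Jpeg)
        End Using
    End Sub
End Module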

Is there a better/more performant/safer/... way to handle this? This way obviously is not working for large numbers of files.

Many thanks in advance.
 
I think it is happening because the app (during debugging) runs a loop on the UI thread for a long time without yielding, so Windows messages are never handled. If this is the case, either call Application.DoEvents in the loop to allow message pumping, or better, run the processing on a separate worker thread.
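
For example, the worker-thread suggestion could look roughly like this with a BackgroundWorker (DoConversion and the other names are placeholders, not the actual code):

VB.NET:
' Sketch: move the conversion loop off the UI thread.
' DoConversion and the file list are placeholders.
Private WithEvents worker As New System.ComponentModel.BackgroundWorker()

Private Sub StartConversion(ByVal files As ArrayList)
    worker.RunWorkerAsync(files)
End Sub

Private Sub worker_DoWork(ByVal sender As Object, _
        ByVal e As System.ComponentModel.DoWorkEventArgs) Handles worker.DoWork
    ' Runs on a thread-pool thread, so the UI stays responsive.
    For Each filePath As String In CType(e.Argument, ArrayList)
        DoConversion(filePath)
    Next
End Sub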

The other reason mentioned on MSDN is memory-release problems: either you are not disposing the objects you use, or the garbage collector never gets a chance because the continuous loop prevents message pumping.
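
On the disposal point: each Bitmap holds unmanaged memory, so it should be wrapped in a Using block (or disposed explicitly) as soon as it is saved, rather than left for the garbage collector. A minimal sketch, with illustrative paths:

VB.NET:
' Dispose each image as soon as it is saved; otherwise unmanaged
' bitmap memory piles up until the GC gets around to it.
Using bmp As New System.Drawing.Bitmap("C:\images\input.bmp")
    bmp.Save("C:\images\output.jpg", System.Drawing.Imaging.ImageFormat.Jpeg)
End Using  ' bmp.Dispose() is called here, even if Save throws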
 
Many thanks for the fast reply :)


I tried it today and, by God, it fixed it... I also took your advice and remade the application to work with threads. I am now, however, faced with a big load on my CPU and RAM: about 150 MB of RAM used while converting, and 100% CPU (on a 2.8 GHz CPU). I limited the maximum number of simultaneous threads to 10, which improved it a bit, but it is still running at around 120 MB / 100% CPU.

Any further ideas to improve this? :confused:
 
A single worker thread looping over the work would be more conservative with memory.

You could try "easing" the thread a bit by letting each loop iteration sleep for a short while, for example 100 ms: call Threading.Thread.Sleep(100) in each iteration.
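
Put together, the single-worker-thread variant with a short sleep per iteration might look like this (DoConversion is again a placeholder):

VB.NET:
' One worker thread processes the whole list sequentially, sleeping
' briefly each iteration to ease CPU load. DoConversion is a placeholder.
Private Sub ConvertAllWorker(ByVal state As Object)
    For Each filePath As String In CType(state, ArrayList)
        DoConversion(filePath)
        Threading.Thread.Sleep(100)  ' yield the CPU for a moment
    Next
End Sub

' Starting it:
'   Dim t As New Threading.Thread(AddressOf ConvertAllWorker)
'   t.IsBackground = True
'   t.Start(fileList)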
 