Thank you for your assistance.
Ibrahim
I was running into the same issue with a 27 MB file (which is not an unusually large file for an editor).
However, I analyzed your code (v288) and found a performance bottleneck.
The method Document.GetText() replaces the new-line characters on the string instance; doing the replacement on a StringBuilder instead works much better. The drawback is that you have to create a new StringBuilder instance, which could be slower for small files, but I think the overhead is insignificant. It also fixes the OutOfMemoryException. I think this is a small fix you should include in the next maintenance release.
//EditorDocument.SaveFile(FullFilePath, Encoding, LineTerminator.CarriageReturnNewline);
using (var sw = new StreamWriter(FullFilePath, false, Encoding))
{
    // Do the line-ending normalization on a StringBuilder instead of
    // calling String.Replace on the full document text.
    var content = new StringBuilder(EditorDocument.GetCoreTextBuffer().ToString());
    content.Replace("\n", Environment.NewLine);
    sw.Write(content.ToString());
}
Hi Tobias,
Do you happen to have any small test projects showing perf comparisons that you could share with us? It might be best if you write to our support address, reference this post, and we can talk more about it there.
I am sorry, but I currently don't have the time to do that. I just ran some quick performance measurements, and it seems that doing the replacement with the StringBuilder is always slower:
Small file: 8 ms / 10 ms
Medium file: 29-33 ms / 39-52 ms
Big file: 220-271 ms / 307-380 ms
The first number is the time needed for the SaveFile() call, the second is the time needed for my proposal.
The "big" file was in this case ~16 MB since I cannot test the performance with the 27 MB file.
Basically, the StringBuilder approach might be a little slower, but it prevents excessive memory spikes and therefore OutOfMemoryExceptions for big files. Until someone finds a better solution, the safest option should be preferred. Maybe you can use this approach only for big documents.
Nobody really cares about 50 ms when saving a file, but an exception that ends up destroying the complete file is definitely a no-go.
Of course there are technical limitations that would make, for example, a 1 GB file hard to process. I understand that such huge files are not supported, but 27 MB is certainly not unusual.
We will go the safe way, I just wanted to let you know about the issue.
Hi Tobias,
We added some code based on your idea that kicks in once the document exceeds a certain size. Would you like to try a preview build to see if it fixes the issue for you? If so, please write to our support address and mention this post.
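For readers who find this thread later, a size switch along those lines could look roughly like the sketch below. It reuses the member names from the snippet earlier in the thread; the 5 MB threshold and the overall structure are assumptions for illustration, not the code that actually shipped.

// Hypothetical size switch; the threshold and member names are assumptions
// borrowed from the snippet earlier in this thread, not the shipped code.
const int LargeDocumentThreshold = 5 * 1024 * 1024; // characters

string text = EditorDocument.GetCoreTextBuffer().ToString();
if (text.Length < LargeDocumentThreshold)
{
    // Small documents: keep the existing fast save path.
    EditorDocument.SaveFile(FullFilePath, Encoding, LineTerminator.CarriageReturnNewline);
}
else
{
    // Big documents: normalize line endings via StringBuilder to avoid the
    // extra full-size string that String.Replace would allocate.
    var content = new StringBuilder(text);
    content.Replace("\n", Environment.NewLine);
    using (var sw = new StreamWriter(FullFilePath, false, Encoding))
    {
        sw.Write(content.ToString());
    }
}
// A real implementation would check the document length without first
// converting the whole buffer to a string.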