Posted 17 years ago by Matt Adamson
Guys,

I have a high-spec PC, e.g. a Core Duo 6700 with 4 GB RAM, and I'm seeing performance issues using two SyntaxEditor controls to load C# files of only 3000 lines.

I'm using AppendText to append one line at a time. Should I set some property before calling AppendText multiple times to improve performance? Perhaps the control is doing some intensive work every time text is appended; if that work were deferred until all the text had been added, it might be quicker.

Just loading 3000 lines takes nearly 10-11 seconds, which is extremely slow.

I also tried disabling semantic and lexical parsing after reviewing the documentation section on handling large files, i.e.

// Performance optimisations being tried out:
leftSyntaxEditor.Document.Outlining.Mode = OutliningMode.None;
leftSyntaxEditor.ContentDividersVisible = false;
leftSyntaxEditor.Document.LexicalParsingEnabled = false;
leftSyntaxEditor.WordWrap = WordWrapType.None;
leftSyntaxEditor.Document.SemanticParsingEnabled = false;


However, this appeared to make no noticeable difference.

Any thoughts on ways to improve this would be appreciated.

Comments (8)

Posted 17 years ago by Paul Fuller
Matt,

At a guess, the problem is your approach of appending one line at a time to the document.

Try adding all of the text in one go by assigning it to the Text property.

If you are only receiving one line at a time, then change that method or use a StringBuilder to accumulate the text before adding it to the Document.
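A minimal sketch of that idea, assuming `lines` is whatever sequence you receive the lines from and `syntaxEditor` is your editor control (both names are placeholders):

```csharp
using System.Text;

// Accumulate every line first; the editor then parses and
// repaints only once, when the full text is assigned.
var builder = new StringBuilder();
foreach (string line in lines)
    builder.AppendLine(line);

syntaxEditor.Document.Text = builder.ToString();
```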

Hope this helps.

Paul
Posted 17 years ago by Actipro Software Support - Cleveland, OH, USA
Hi Matt,

Any time you make a modification, you run through the full change-processing pipeline: lexical/semantic parsing, outlining, undo item additions, invalidating the display lines, moving scrollbars, etc. The Document class does have a LexicalParsingEnabled property that you can set to false while making bulk changes; however, that only suspends parsing, and the other normal control updates still occur.

As Paul said, the proper way to get the best speed is to use a StringBuilder to build up the full text of the document and then assign it to the Document.Text property all at once.


Actipro Software Support

Posted 17 years ago by Matt Adamson
That certainly explains why the performance issues are so pronounced.

Using AppendText a line at a time is preferable because of the way I'm building the document: I append each line, then access the DocumentLine object to highlight specific lines and change the custom line number.

I could certainly investigate changing this, perhaps using another structure to help when going through all the code a line at a time. However, wouldn't it be relatively easy to add a property to the SyntaxEditor control, or its Document property, called something like UpdateGUI, which you could set to false before adding content and back to true when finished?

The .NET Framework and other frameworks such as MFC support this, e.g. BeginUpdate/EndUpdate when adding items to a combo box / list box to prevent flickering.

[Modified at 06/27/2007 10:43 AM]
Posted 17 years ago by Actipro Software Support - Cleveland, OH, USA
I would recommend adding all the text at one time and then looping back to make your UI updates. We do have methods like SyntaxEditor.SuspendPainting and SyntaxEditor.SuspendLayout that you might want to call, along with their Resume counterparts afterwards.

But better yet, why not create a Document, do all your work on that, and then attach the Document to the SyntaxEditor after the work is done? That way you only deal with UI updates once, at the end. However, the document will still parse each time you update its text, so you may want to suspend lexical parsing, as mentioned in a previous post, if you do add line by line.
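A rough sketch of that approach, combining it with the parsing suspension from earlier in the thread (`lines` is a placeholder for your input; exact class and member availability may vary by SyntaxEditor version):

```csharp
// Build the document off-screen so no UI updates occur until the end.
Document document = new Document();
document.LexicalParsingEnabled = false;   // suspend parsing during bulk changes
document.SemanticParsingEnabled = false;

foreach (string line in lines)
    document.AppendText(line + Environment.NewLine);

// Re-enable parsing so the finished text gets lexed normally.
document.LexicalParsingEnabled = true;
document.SemanticParsingEnabled = true;

// Attach to the editor in a single step; the UI updates only once.
leftSyntaxEditor.Document = document;
```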


Actipro Software Support

Posted 17 years ago by Matt Adamson
Thanks for the suggestions, everyone.

The biggest improvement came from creating a separate Document object first and adding the text to that. However, two things made very little difference, i.e. only milliseconds rather than seconds:

a) Suspend / Resume calls - negligible difference
b) Disabling semantic / lexical parsing before appending and re-enabling it after

I was surprised by (b), though. Are you sure the parsing is performed every time the AppendText method is called? If the Document isn't attached to the SyntaxEditor control, perhaps it isn't parsed at all.

In any case, I'm getting reasonable performance now, i.e. 1-2 seconds for two editor controls loading a 3500-line file.
Posted 17 years ago by Actipro Software Support - Cleveland, OH, USA
FYI... Parsing will always take place (unless you've disabled it), even when a Document is not attached to a SyntaxEditor. However, lexical parsing is fairly well optimized, so a change doesn't scan more than a line or two unless it needs to. Those optimizations may be why you didn't notice much change when disabling parsing.


Actipro Software Support

Posted 17 years ago by Matt Adamson
Thanks for all your help. Yes, that's great; I assumed there were optimisations being made here.

The main performance issue, then, was the user-interface drawing updates in the control, which is mitigated by attaching the Document to the SyntaxEditor after the updates are complete.

Perhaps you could add this tip to the documentation in the section "Large File Handling"?
Posted 17 years ago by Actipro Software Support - Cleveland, OH, USA
Done, thanks!


Actipro Software Support

