
Stress testing V

Use this forum to ask any questions and to submit bug reports

Moderator: vuser

jrbarnett
Posts: 5
Joined: Sat May 10, 2014 6:08 am

Stress testing V

Postby jrbarnett » Sat Nov 28, 2015 11:17 am

So, the V website says that it can handle 100MB or even 100GB files with ease. I decided to put that to the test.
I used the technique at http://www.windows-commandline.com/how-to-create-large-dummy-file/ to generate a single 128GB text file with the same line repeated. The GNU Win32 tools' wc -l reported the total number of lines in this file as 2147483680.
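
For anyone who wants to reproduce this without the Windows command-line tricks, here's roughly the idea as a Python sketch (my own illustration, not the commands from that page; the file name and repeated line are just placeholders):

TARGET_BYTES = 128 * 1024**3                   # ~128GB; shrink for a quick trial
LINE = b"The quick brown fox jumps over the lazy dog\r\n"  # any repeated line
BLOCK = LINE * (64 * 1024**2 // len(LINE))     # ~64MB of the line per write

written = 0
with open("dummy.txt", "wb") as f:
    while written < TARGET_BYTES:
        f.write(BLOCK)    # a couple of thousand big writes, not billions of small ones
        written += len(BLOCK)

Writing in ~64MB blocks keeps the write count manageable, so generating the file is limited by disk throughput rather than per-call overhead.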

My test kit was: Core i7-4790, 16GB RAM, 2TB disk, Windows 8.1 with Classic Shell and V14 SR7 x64. Note this was physical hardware, not a VM.

Opening it in the test environment, I could scroll around quite easily, search for content, etc. Counting instances of a particular search phrase, I gave up after around 26% of the search as it took too long; the search seems to run on the same thread as the main application. Going to line 1000000 took a few seconds but got there easily enough. Using the goto command to jump to the last line number minus 1, I again gave up without seeing any progress.

The vertical scrollbar appears to show the position within the current chunk rather than within the file as a whole; I had to click below the end of a chunk to move on to the next one.

While I had this running I kept Task Manager open. It seems ludicrous that on a system with 16GB of RAM and only 14% total memory usage, I couldn't get V to use more than 2.4MB of memory, when there are obvious benefits to caching parts of such a large file. This leaves little apparent benefit in moving to the 64-bit version, whereas I would have thought files of this size would gain greatly from it.

Attempts to use a more diverse set of realistic data, such as the complete works of Shakespeare or Dickens from Project Gutenberg, as a base for duplication made creating the file rather more difficult: at 5.1MB and 6.4MB respectively, they don't scale to round file sizes. Another possible candidate for large-scale testing would be a Wikimedia dump, such as those available from http://dumps.wikimedia.org/.

Overall, I'd be the first to admit that these techniques bear no resemblance to my own real-world usage, and that in some respects the results therefore don't have much useful meaning. However, in that regard they are no worse than other stress testing that colleagues and I have done with JMeter against web applications. My results do bear out the website's advertising that this application can handle 100GB files, though.
I've probably spent more time stress testing my PC than V itself.

John

FileViewer
Site Admin
Posts: 287
Joined: Fri Apr 30, 2010 5:50 pm

Re: Stress testing V

Postby FileViewer » Sat Nov 28, 2015 8:12 pm

Counting instances of a particular search phrase, I gave up after around 26% of the search as it took too long

Unfortunately, searching a 128GB file takes time. There is not much I can do to speed this up (short of indexing the file).
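
To illustrate the point (this is just my sketch of the general idea, not V's actual search code): counting matches means every byte has to come off the disk once, so on a 128GB file you are limited by disk throughput no matter how fast the matcher itself is. The small carry-over handles phrases that straddle a chunk boundary:

def count_occurrences(path, phrase, chunk_size=64 * 1024**2):
    # Stream the file in ~64MB chunks; total time is dominated by disk reads.
    needle = phrase.encode()
    carry = b""                      # tail of the previous chunk
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return total
            buf = carry + chunk
            total += buf.count(needle)
            # keep len(needle)-1 bytes so a match spanning the boundary is still seen
            carry = buf[-(len(needle) - 1):] if len(needle) > 1 else b""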

Using the goto command to jump to the last line number minus 1, I again gave up

Going to the last line number in the file takes time because I need to parse the entire file so I can count the lines!
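
In pseudocode terms (again, a sketch of the idea rather than V's code), finding where line N starts means counting every newline before it; there is no shortcut:

def offset_of_line(path, lineno, chunk_size=64 * 1024**2):
    # Returns the byte offset where line `lineno` (1-based) starts.
    seen, offset = 1, 0
    with open(path, "rb") as f:
        while seen < lineno:
            chunk = f.read(chunk_size)
            if not chunk:
                raise ValueError("file has fewer lines than requested")
            newlines = chunk.count(b"\n")
            if seen + newlines >= lineno:
                pos = 0
                while seen < lineno:             # walk to the right newline
                    pos = chunk.index(b"\n", pos) + 1
                    seen += 1
                return offset + pos
            seen += newlines
            offset += len(chunk)
    return offset

For the last line of a 128GB file, that scan is the whole 128GB, which is why the Last Chunk button mentioned below is the quick alternative.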

BTW: If you wanted to go to the end of the file, you could click on the Last Chunk button on the toolbar. This would take you immediately to the last line in the file (although you wouldn't know the correct line number).

