Consumption of resources

Tristis Oris:

Hi Nir. I know these questions get asked often, but can we expect reduced memory consumption and disk access in the future?
I have a lot of music and I don't mind sharing it, but then comfortable work at the computer becomes impossible. Any drive is the weakest point of the system, and even 50 thousand files in a share is noticeably slow.

http://imgur.com/rp2tQ

Is it possible to reduce the load, perhaps by reducing the number of search queries coming in from outside, or is this simply to be expected with a share this size?
I know that Qt is not the most lightweight library and consumes more resources than some alternatives, but it's your choice and I'm not complaining.

If it helps, I can provide any information on resource consumption with different numbers of shared files.
For now I have to use SS from another computer, where there is much less music.

In any case, thank you for this wonderful resource. Whatever the problem, you can always get an answer =)

Hey Tristis,

This is a complicated question. I'll try to cover as much ground as possible.

As for the consumption of memory, there is very little that can be done about how much memory SoulseekQt's data system takes, but big savings may be had in the way the data system is used. Essentially, I could lessen my dependence on it and rely more on hard-coded C structures to store file-sharing data, but I'd lose a lot of the power I have right now to easily do things like on-the-fly filtering and efficient processing of searches.

That said, unless you had some open shares and search result windows when that memory consumption figure was taken, I suspect there's either a very serious memory leak in the client I haven't discovered yet, or else this is due to memory fragmentation. Memory fragmentation is what happens when a program allocates and frees a lot of little bits of data in seemingly random order: it ends up with a lot more memory allocated than it's actually using, because it has a hard time filling the little holes that form in its memory space as a result. Memory fragmentation is very hard to control, but something like creating a fresh copy of all the client's data and using it to replace the old copy every once in a while might help consolidate memory use. If you'd like to try it out, I'll link you to a build that does that once I implement the copy/replace process.
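
Just to illustrate the copy/replace idea (this is a simplified sketch, not the client's actual code; the ShareIndex type and compact() function are made up for the example):

```cpp
#include <string>
#include <unordered_map>
#include <vector>

// Made-up stand-in for the client's share data: search token -> file IDs.
using ShareIndex = std::unordered_map<std::string, std::vector<int>>;

// Rebuild the index into freshly allocated storage and swap it in.
// After many small allocations and frees, the old copy's data is
// scattered across the heap; copying everything into a new object packs
// it into newly obtained blocks, and destroying the old copy returns
// the fragmented regions to the allocator all at once.
void compact(ShareIndex& index) {
    ShareIndex fresh;
    fresh.reserve(index.size());             // one up-front bucket allocation
    for (const auto& [token, files] : index) {
        fresh.emplace(token, std::vector<int>(files));  // tight copy
    }
    index.swap(fresh);                       // old, fragmented copy dies here
}
```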

As for performance problems: what do you mean by noticeably slow? Does your entire computer slow down when SoulseekQt is open? There's probably plenty of room for performance optimization in SoulseekQt, but figuring out what needs to be optimized is difficult if I can't replicate these performance issues. The main culprit may very well be the returning of search results. You can try switching your share to private and seeing if that improves the client's performance. If so, something definitely needs to be done in that area.

As for disk access: SoulseekQt only accesses your file system when it's re-scanning your shares and when actually uploading or downloading files. It doesn't really do anything that the original client didn't do in terms of file I/O. Perhaps Qt does have performance issues on Windows? I have not run into any myself.

Any observations you can offer about any of this are welcome. I'm all for keeping this dialog open in the interest of improving things where they need to be improved.

Two specific questions though: How many files were you sharing when the screenshot was taken? And what task manager is that? It looks pretty cool.

Thanks, Nir

Tristis Oris:

Thank you Nir.
I have Windows 8 now, and that's its new interface style =)

The OS is installed on an SSD; the music is on a separate 3TB SATA3 drive. The total number of files is about 250 thousand, but the heavy-load problems begin after 50-100k shared files. It's all bearable, but I'm a little worried about the lifespan of the disk =)
On my other PC I share around 150GB and everything is okay.

Considering that few people share more than 5-20 thousand files, not many run into these difficulties, so few people notice.
There may be some discussion of this subject on your main forum, but my English isn't good enough to read it =)

To say that the computer works more slowly isn't quite right. I can "feel" that it consumes more resources, but it's working with the disk where the music is stored that becomes very difficult.

One more little detail, present in the old client and now too: it periodically freezes for 5-10 seconds, and the more people are downloading, the more these hangs occur. Perhaps it's related to downloads finishing and starting.

If I can help with any statistics, data, or tests, I'll gladly do it. I have great respect for your product and I think it should keep evolving. I'm also willing to test beta versions if that helps in any way.
I have the opportunity to test the client on different operating systems and computer configurations.

Tristis Oris:

Another little thought. Looking at the SS logs, every search request receives a response from the user's computer, so resources are spent searching for and answering even queries that match nothing. If I understand this correctly; please correct me if not.

What if you created an index file (if there isn't one already), like search engines use, which would group the names of files/folders etc. with links to the users who have them?

Well, 250k is certainly a lot of files. It will definitely use a lot more memory than Soulseek NS did, and I don't think there's too much that can be done about that.

But I can say this about disk access: it shouldn't be any different than Soulseek NS. Your file system is only accessed once when your shared folders are being scanned at the beginning of a session, and whenever a file is being downloaded or uploaded. Files are read in the same way Soulseek NS reads them, so unless I'm missing something (which is possible), you shouldn't have any new problems with SoulseekQt.

As for performance issues, SoulseekQt should be processing searches about as efficiently as Soulseek NS. Basically, your client has a list of individual tokens, each associated with a set of shared files, that it cross-references to get the right results. Looking up that list of tokens is really 99% of processing search requests. With so many files shared it's a huge list, and one thing that could be done is to change the data structure used to index these tokens from a binary tree to a hash table, which should be faster.
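
To make that concrete, here's a rough sketch of the kind of lookup I'm describing (illustrative only, not SoulseekQt's actual code; the TokenIndex type and search() function are invented for the example, with std::unordered_map standing in for the hash table):

```cpp
#include <algorithm>
#include <iterator>
#include <set>
#include <string>
#include <unordered_map>
#include <vector>

// Illustrative token index: each token maps to the sorted set of shared
// file IDs whose name contains it. std::unordered_map is a hash table,
// so each token lookup is O(1) on average; a binary tree (std::map)
// would make the same lookup O(log n).
using TokenIndex = std::unordered_map<std::string, std::set<int>>;

// Answer a query by intersecting the file sets of all its tokens.
std::vector<int> search(const TokenIndex& index,
                        const std::vector<std::string>& tokens) {
    std::vector<int> results;
    bool first = true;
    for (const std::string& token : tokens) {
        auto it = index.find(token);       // the lookup that dominates
        if (it == index.end()) return {};  // unknown token: no matches
        if (first) {
            results.assign(it->second.begin(), it->second.end());
            first = false;
        } else {
            std::vector<int> narrowed;
            std::set_intersection(results.begin(), results.end(),
                                  it->second.begin(), it->second.end(),
                                  std::back_inserter(narrowed));
            results = std::move(narrowed);
        }
    }
    return results;
}
```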

But considering all these things, and as hungry and active as SoulseekQt appears to be on your machine, its consumption really isn't surprising given how much it appears to be doing, how many files you're sharing and how fast you're uploading, especially considering how platform-agnostic Qt is. I'm willing to try anything though, and if you have any more information you think might be relevant, please don't hesitate to tell me.

I'm not sure I understand your last question. Maybe my explanation of the search engine above answers it? If not let me know.

Thanks, Nir

Tristis Oris:

I mean indexing like Google does: answering requests from the index file alone, without touching the disk.

Yesterday I watched SS for a while and noticed that after 6-8 hours (5 upload slots at 3MB/s), the number of disk requests and the CPU consumption dropped significantly, to a comfortable level.
I'm constantly copying files to that disk and editing file tags, so I have an objective sense of the response time for any operation.

[links removed]

Task Manager is of course not the best means of collecting this kind of data; if you can tell me what to use to collect more accurate information, I'm willing to provide it.

What do you think: would it be worth comparing SS performance with Windows indexing turned on? I currently have it turned off.

I think I'm gonna have to admit defeat with this one. SoulseekQt doesn't access your hard drive to respond to search requests; that's all done in memory. I'd suggest the newly implemented background re-scanning of shared folders on startup as a cause of this disk load, but the build you're using doesn't have that. And Windows indexing shouldn't have an effect on this, I think.

Sorry I can't be of any more help.

edit: There's one more thing I think may make a difference, and that's the writing of the configuration file every five minutes. With so many files shared, yours is probably really big. Let me link you to a build that only saves configuration information when the client is shut down. I'm very doubtful it will have a significant effect on resource consumption, but it's really the last thing I can think of. I'll be posting the link in a little bit.

Here we go: http://www.slsknet.org/SoulseekQt/Windows/SoulseekQt-9-25-2012.exe
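
For the curious, the change amounts to something like this (a minimal Qt sketch, not the actual client code; saveConfiguration() is a stand-in for the real serialization routine):

```cpp
#include <QCoreApplication>
#include <QObject>
#include <QTimer>

// Stand-in for the real routine that serializes shares, options, etc.
void saveConfiguration() { /* write the (potentially huge) config file */ }

int main(int argc, char* argv[]) {
    QCoreApplication app(argc, argv);

    // Old behavior: rewrite the whole configuration every five minutes,
    // which with a 250k-file share means a steady background disk load.
    // QTimer autosave;
    // QObject::connect(&autosave, &QTimer::timeout, saveConfiguration);
    // autosave.start(5 * 60 * 1000);

    // New behavior in this build: write it once, on the way out.
    QObject::connect(&app, &QCoreApplication::aboutToQuit,
                     saveConfiguration);
    return app.exec();
}
```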

Tristis Oris:

It's great! Big thanks, Nir!
The client now starts in 5-10 seconds; before, it took 1-2 minutes!
On the weekend I'll be able to give a more detailed review =) but I can already say that SS in this build is faster. Memory consumption stayed the same, but that doesn't matter to me. Right now I'm watching over RDP from work how SS is running, and so far I can't say anything about locking/freezing, but I'll check all aspects of SS.

Glad to hear the background scanning's working well over there! Like I said, not having it save the configuration data every five minutes may not make a difference, but let me know what you find.

Thanks, Nir

Tristis Oris:

Everything is wonderful. Memory consumption, as I wrote, has not decreased, but otherwise SS no longer draws attention to itself while running.
The freezes I noted earlier are no longer observed, only during the first minute while all the services load. And that's fine too.

I believe that saving the information only when you exit the client is a good solution, because in the end it makes no difference, unless a crash happens.

Thank you very much, Nir, for taking the time to understand my problem and propose a resolution.

Excellent! I'm really glad we've managed to take things as far as we did. As you might have read in the release notes for the 9/28 build I've settled on saving the configuration data every hour. If this is still causing problems for you, let me know and I'll add an option to turn it off altogether.

Thanks, Nir