Squeezecenter 7.0 Gold sound quality?


JEaton

  • Full Member
  • Posts: 472
Re: Squeezecenter 7.0 Gold sound quality?
« Reply #60 on: 23 May 2008, 08:44 am »
That's really kind but they don't do a version for Mac. Also, does anyone know the Mac equivalent of the my.tt file mentioned here, please?

Well then, I guess that sucks.

larrettp

  • Jr. Member
  • Posts: 22
Re: Squeezecenter 7.0 Gold sound quality?
« Reply #61 on: 23 May 2008, 12:53 pm »
I can connect to the C drive on the ReadyNAS and I've removed the line from my.cnf - I looked for my.tt but couldn't find it (any idea which directory?).

I restarted the NAS and I can still connect to 192.168.2.4 Port 22.

I am then using MySQL Administrator to try to connect to 192.168.2.4 Port 9092, user root, password, and socket /var/run/mysqld/mysqld.sock

I then get:

Connection error

Could not connect to MySQL instance at 192.168.2.4.
Error:Can't connect to MySQL Server on 192.168.2.4 (61)
Code 2003

I can Ping it OK but I'm currently out of ideas on this. If you have any, please let me know.

Thanks


larrettp

  • Jr. Member
  • Posts: 22
Re: Squeezecenter 7.0 Gold sound quality?
« Reply #62 on: 25 May 2008, 10:05 pm »
Optimisation of the SqueezeCenter database
------------------------------------------

From the outset, I was not happy with the overall performance of the Squeezebox. Mine is used in conjunction with a ReadyNAS and I was constantly told that this had limitations, which meant it was always going to be ‘slower than on other platforms’. I downloaded the SqueezeCenter software to my Mac PowerBook G4 and found that I still wasn’t overwhelmed – the same problems were there.

My main issue was that, the larger the Playlist, the longer it took to add items to it. This meant that functionality was constrained by size, which should never be an issue in a well-designed database. I should state here that I am a professional Mainframe and mid-range Database Administrator (DB2, IDMS, SQL Server and a bit of Oracle).

I downloaded MySQL Administrator and Query Browser and also switched on SQL Logging in SqueezeCenter.

Firstly, with Administrator, I looked at the tables and their Indexes. This immediately rang alarm bells. Among the issues I found were:

•   Several Many-to-Many tables – These are always going to cause problems in a database, as they are an I-O overhead to maintain and, without careful indexing, to reference.
•   The Many-to-Many tables all had Foreign Keys (RI) to their parent tables. This is pointless in a database that is not heavily transactional. Worse still, they all had ‘Cascade on Delete’ parameters. The fact that RI exists plus the Cascade parameter creates an enormous I-O overhead on the system.
•   Nearly all of the tables had inadequate Indexing for the Queries being run. This creates even more I-O and adds to the performance problems.
•   Within the code itself, there is huge repetition of effort and checking of the Catalog tables at times when no such checking is required for normal functioning.
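
Anyone wanting to see this on their own installation can pull the definitions straight out of MySQL (via Query Browser or the mysql client). The table name below is only an illustration; substitute whichever of the Many-to-Many link tables you want to look at:

    SHOW CREATE TABLE contributor_album;  -- shows the columns, indexes and any FOREIGN KEY ... ON DELETE CASCADE clauses
    SHOW INDEX FROM contributor_album;    -- lists every index and the columns it covers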

After this, I ran a full Scan. I saved the scan log from this and trimmed it down to retrieve all of the SQL used during the different parts of the process. I then created, saved, loaded and cleared a Playlist. I saved the server log and trimmed it in the same way as the scan log.

I saved a representative sample of 40 SQL statements that were run regularly and used the MySQL Query Browser to EXPLAIN them. This showed the access paths used when each was run.
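
For anyone who hasn't used it, EXPLAIN simply takes the SELECT exactly as it appears in the log. The statement below is a made-up illustration rather than one of the actual 40; the important parts of the output are the 'key' column (which index, if any, was chosen) and the 'rows' estimate:

    EXPLAIN
    SELECT t.url, t.title
    FROM tracks t
    JOIN contributor_track ct ON ct.track = t.id
    WHERE ct.contributor = 42;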

My actions were limited in that I didn’t want to change any of the code (I’m a DBA, not a programmer) and, therefore, I could only provide fixes that made the database fit the programs as they are now. If any programming changes recommended later are applied, the database can be changed accordingly.

The first thing to remove was the RI (Foreign Keys). These served no practical purpose and were a performance overhead. I did this and ran various tests with no adverse effects.
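
If you want to do the same, note that MySQL needs the constraint name (not the column name) to drop a foreign key, and SHOW CREATE TABLE tells you what it is. The table and constraint names below are only examples; check your own output first:

    SHOW CREATE TABLE contributor_album;   -- note the CONSTRAINT `...` FOREIGN KEY lines
    ALTER TABLE contributor_album DROP FOREIGN KEY contributor_album_ibfk_1;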

Secondly, I applied Indexes to the Many-to-Many tables (these only have 2-3 columns) in EVERY possible permutation. For instance, where there are columns A, B and C, I created indexes for ABC, ACB, BAC, BCA, CAB and CBA. Obviously, this is well overdone but, without access to the code, the overhead of creating these at the time of the scan was far less than that of the RI, and the potential benefits when the database is READ were huge. I also removed the existing single-column Indexes.
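
To make that concrete, sticking with the A, B, C illustration (the table and column names here are purely hypothetical; map them onto whichever Many-to-Many table you are working on):

    ALTER TABLE link_table
      ADD INDEX idx_abc (a, b, c),
      ADD INDEX idx_acb (a, c, b),
      ADD INDEX idx_bac (b, a, c),
      ADD INDEX idx_bca (b, c, a),
      ADD INDEX idx_cab (c, a, b),
      ADD INDEX idx_cba (c, b, a),
      DROP INDEX a;   -- the old single-column index, whatever yours is actually called

Doing it all in one ALTER TABLE also means the table only gets rebuilt once rather than once per index.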

Next, I looked at the 40 SQL statements I had saved and ensured each table had an appropriate index for optimum retrieval for each statement. The only other Index I added which had a big impact was:

•   CONTRIBUTOR on the Albums table
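
In SQL terms that one is simply the following (assuming the column really is called contributor; check with SHOW CREATE TABLE albums first if yours differs):

    CREATE INDEX albums_contributor ON albums (contributor);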

Having carried out all of the work on the tables, indexes and RI, I used MySQL Administrator to Optimise the tables. As so many new indexes had been created, this seemed like a good idea to make sure they were properly organised.
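
Behind the scenes this is nothing more exotic than OPTIMIZE TABLE, so it can equally be run from Query Browser. The table list below is just an example; include whichever tables you have changed (on InnoDB tables MySQL does this as a recreate plus analyse rather than an in-place optimise, which amounts to the same tidy-up):

    OPTIMIZE TABLE tracks, albums, contributor_album, contributor_track;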

I then ran all of the sample statements through MySQL Query Browser to compare the results. All of the statements returned the same result sets, but they ran visibly faster and the access paths were greatly improved.

Following this, I ran a full library scan again and created, saved, cleared, loaded and played a Playlist. The difference was remarkable. The whole system was MUCH less clunky and had no loss of functionality or any errors.

Recommendations
---------------

1.   I strongly recommend the removal of all Foreign Keys in the database. These are an overhead providing a database facility that is inappropriate for this application.
2.   I also recommend a complete overhaul of the table indexes – especially on the Many-to-Many tables. This can, obviously, be carried out more easily with full vision of the code (which I still don’t seem to have) but, in many cases, making the Index fit the SQL in the log is not a bad place to start.
3.   Is it possible to add a database function to the Settings panels that would instigate a MySQL Optimisation? Alternatively, can this be done at the end of the scan process? Thinking about this, the scan process DOES talk of Optimisation as its last step but I’m not sure what it means by this.
4.   Is it possible to switch off artwork? A lot of time is spent in the programs chasing artwork that may or may not be there and the only place you normally see it is in the web interface. Personally, I’m not bothered and if the system can work more efficiently without it, I’d rather have this option.

The application
---------------

As I said earlier, all of the above database changes were made without changing any of the code. I did notice, though, that the strategies of the programs could be modified to produce the same results much more efficiently.

1.   Library scan – when a scan takes place, it creates a Catalog of tables that are used later when creating and running Playlists. The reason for scanning is to take into account any new or deleted music, Playlists etc. This takes a considerable time to run but, if it supports the system later, it has to be done. It does mean, however, that the music currently being played can be disrupted – almost to the point of being unlistenable. Is there any way that the scan can be run in its own memory address space to keep it separate from the part of the application which is playing the music (or vice versa)?
2.   Creating Playlists - When creating a Playlist, the programs effectively carry out ‘mini-scans’ for EVERY track to make sure they still exist!! This is really where the performance goes downhill once the database access issues have been resolved. Why is this done again? We already have the Catalog tables from the last scan and all we need to create the Playlist is the URL and the artist/album/track etc. details. If the physical files have been deleted, we shouldn’t care at this stage. We are just building a ‘shopping list’ of tracks to play in the future based on our latest Catalog. Adopting this strategy will cut out a whole load of I-O activity.
3.   Loading Playlists - When loading a Playlist that has already been created, the ‘mini-scan’ process described in ‘2’ is run again!!! Really, how often do you want to scan this database before you actually PLAY SOME MUSIC!!!!
4.   Saving Playlists – Guess what? When we save a Playlist, we perform yet another ‘mini-scan’. Stop messing about. Just save what you have got.
5.   Playing Playlists – at the time of playing, the application checks to see if the file exists (as if it hasn’t checked enough times before). If it exists, it plays it. If it doesn’t, it moves on to the next – although sometimes it hangs.
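
To illustrate point 2, everything a Playlist entry needs is already sitting in the Catalog tables, so one read of the database is enough to build the 'shopping list' with no trip out to the filesystem at all. The table and column names below are illustrative rather than lifted from the real schema:

    SELECT t.url, t.title, a.title AS album
    FROM tracks t
    JOIN albums a ON a.id = t.album
    WHERE a.title = 'Some Album'
    ORDER BY t.tracknum;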

Let’s get this into perspective using the shopping list analogy. You need to go to the supermarket to get some food. You know the stuff that they sell so you make a list of what you want before you go. You go to the shelves for the items you want and, if they haven’t got one of them, you move on to the next item on the list. That is, in my opinion, exactly the strategy that SqueezeCenter should be adopting. With the current setup, you make your list, phone the store to see if they have everything in stock, go to the store, check with customer services to see if they still have it all and then go to the shelves to get it (or not, if you’ve spent too much time checking and, in the meantime, someone else has taken it!).

Conclusion
----------

This has taken a lot of work in the face of many on these forums taking the attitude that ‘If you’re running a ReadyNAS, what do you expect’! Maybe it’s not the fastest piece of kit on the market but I knew that when I used it for backup and retrieval purposes it showed no problems with response times. Therefore, the software had to be – at least – partly to blame. What I have presented here is only a workaround solution but it gives a lot of pointers to the kind of changes the developers should be focussing on in the future. Remember, the whole idea of this kit is to PLAY MUSIC – not to endlessly wait for an unresponsive machine to maintain an over-engineered database.

Caveat
------

If you feel confident to make these changes yourself, good luck. I have had nothing but good results from them, so far. However, I take no responsibility for any loss of data or service from your setup. Let’s hope the developers take notice of this and incorporate these, or similar, improvements into the released versions of the software.