MySQL Size and Optimization Question

5 replies
Hi,

Firstly, apologies for what's probably a newbie question, but I'm really unfamiliar with MySQL administration.

I have two related questions and wondered if someone could help.

1) Is there any potential damage in running a daily PHP cron job to OPTIMIZE all the tables in a database? I'm trying to reduce the overhead to keep the DB size to a minimum.

2) I'm trying to get my head around storing a large amount of data - approximately 300,000 records that, extrapolating from the 1,000 records present, will take up about 750MB of space for that table. Has anyone got first-hand experience in doing something with this many records? Is the slowdown likely to be highly noticeable (i.e. greater than 3-second waits for queries)?

In relation to number 2), if anyone also has a suggestion (utilising another server / upgrading RAM) that would help, I would be grateful.

Thanks

Justin
#mysql #optimisation #question #size
  • webpro4hire
    Hi Justin,


    1) It depends. How many tables are in your database, and how big is each individual table? Also, what's the frequency of lookups on those tables? Are they mostly reads, or inserts/updates/deletes? Another consideration is the table indexes: on big tables with many indexes, optimisation can take a long time and locks the table.

    Do ALL tables really need to be optimised? Usually only tables with lots of CRUD activity and indexes need optimisation, and not as often as daily. I've worked on databases with millions of records handling thousands of requests per second and we did not optimise tables daily. A lot has to do with the initial database/table design. Not having this information makes it difficult to answer your question completely. A rough way to check whether a table is even worth optimising is sketched below.
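
    For example (the table name 'orders' is just a placeholder, substitute your own):

        -- Data_free shows roughly how many bytes of reclaimable overhead the table carries
        SHOW TABLE STATUS LIKE 'orders';

        -- only worth running when Data_free is large relative to Data_length;
        -- note that it can lock the table while it rebuilds
        OPTIMIZE TABLE orders;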


    2) Again, it depends on many factors, such as table design, whether the table is mostly read or has lots of concurrent inserts/updates/deletes, and the field definitions and usage (doing selects on TEXT columns is a lot slower than on INT columns; multiple WHERE clauses, joined tables, etc.).

    300,000 records is not a lot of records. We've had tables with 3 million+ records @ 3 GB that returned data in well under 1 second. Database and table design and query optimisation are where the secret lies... (rough sketch below to illustrate).
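
    Just to make the design point concrete, here is a rough sketch (all names made up) of the kind of layout that stays fast at this size - integer key, indexes on the columns you filter on, TEXT stored but not searched:

        CREATE TABLE products (
            id       INT UNSIGNED NOT NULL AUTO_INCREMENT,
            name     VARCHAR(255) NOT NULL,
            price    DECIMAL(10,2) NOT NULL,
            category INT UNSIGNED NOT NULL,
            details  TEXT,                             -- fine to store, avoid filtering on it
            PRIMARY KEY (id),
            KEY idx_category_price (category, price)   -- index what you actually filter/sort on
        ) ENGINE=MyISAM;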


    Hope this helps.
    WP4H
  • leiden
    1. No possible damage, but no possible advantage either, IMHO. Depending on the size of your database this may take a considerable amount of system resources (CPU and I/O) and would not produce a considerably smaller database. Even running it once a month is probably overkill.

    2. Just make sure you use properly created indexes and you'll be fine; MySQL can handle tables of that size easily.
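
    For example (table and column names are placeholders, adjust to your schema):

        -- add an index on the column your WHERE clauses actually use
        ALTER TABLE products ADD INDEX idx_category (category);

        -- this lookup can now use the index instead of scanning every row
        SELECT id, name FROM products WHERE category = 42;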
  • mikeonrails
    The secret is in the indexes. MySQL never has to scan the whole table if you have indexes on the right columns. You can figure out what to index by taking your top SELECT statements and putting EXPLAIN in front of them. For example:

    If you normally use SELECT prices FROM products; then put EXPLAIN SELECT prices FROM products;

    You'll get a table of information on how MySQL plans to perform the query. If you notice that it is scanning all the records or not using any indexes, then the response will be slower than necessary (see the sketch below). I agree that 300,000 records @ 750MB is nothing to worry about.
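
    Roughly what that looks like in practice (table and column names are only illustrative):

        EXPLAIN SELECT prices FROM products WHERE product_id = 123;

        -- in the output, check the "type" and "key" columns:
        --   type = ALL and key = NULL        -> full table scan, add an index on product_id
        --   type = ref/const with a key name -> MySQL is using an index, you're fine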
  • Ken Durham
    e-commerce site?
    Specific script you are using?
    Hosting type?
    If it's a free shopping cart like zencart or something of that nature, then you can damn well bet it will slow down if it's not on a dedicated server. It won't be MySQL though; it will be PHP slowing you down. The modifications I had to make to zencart were unreal in order to get it to perform with over 100,000 records on a shared server. Some of the data handling I completely rewrote. I eventually went to a dedicated server.
    Also, you're not going to run at decent speeds until the server is optimized for parsing if you're running a script like that. I fought this battle before, mistaking it for MySQL congestion... though optimizing the MySQL server is suggested.

    Proper coding and optimizing your server are key to good speeds; then you can handle millions of records. With bad code and default server configurations, you may have headaches with a few thousand records. A quick way to see what the server is currently set to is sketched below.
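
    Something like this from the mysql client gives you a starting point before you tune anything (the values you see will be whatever your host has configured):

        -- current buffer/cache settings
        SHOW VARIABLES LIKE '%buffer%';

        -- counts queries that ran longer than long_query_time
        SHOW GLOBAL STATUS LIKE 'Slow_queries';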

    These are your friends in time of trouble and testing:
    http://dev.mysql.com/downloads/workbench/

    MySQL :: MySQL 5.0 Reference Manual :: 7 Optimization


    Bi-weekly or monthly optimization.
  • Flipfilter
    Thanks Everyone.

    I went away for the weekend and didn't expect to get answers so quickly, but I'm genuinely amazed by all the help.

    I think optimization is the key and I've possibly been worrying about the wrong thing, especially now that I have to figure out how to optimize the process of retrieving age and page rank for every one of those records :confused:

    Many Thanks
