My data file is quite big and the application is very slow

shabandri (Quicken Windows Subscription Member ✭✭)

My data file is quite big and the application is very slow. Is there a best practice for how many years of data to keep in a data file? It was suggested that I validate the file, but the validation did not produce anything actionable.

I have been using Quicken since 2005 and am presently on release R60.15 Build 27.1.60.15

Appreciate any pointers.

Answers

  • Chris_QPW (Quicken Windows Subscription Member ✭✭✭✭)

    Define "quite big"?

    Quicken's performance almost never has anything to do with the size of the data file.

    Signature:
    This is my website (ImportQIF is free to use):

    http://www.quicknperlwiz.com/

  • Jim_Harman (Quicken Windows Subscription SuperUser ✭✭✭✭✭)

    How big is "quite big"?

    How slow is "very slow"?

    Go to Tools > Account list. Click on Options at the bottom left and make sure "Show number of transactions" is selected. What is the largest number of transactions in each account type?

    Generally slow operation is more affected by the number of transactions in an account than the age or size of the file. Usually there are better options to improve performance than trying to reduce the size of your data file.

    For example, my data file is about 200 MB and I do not experience any significant slowdowns. The largest number of transactions in each account type is approximately:

    • Spending 9,600
    • Savings 600
    • Credit 5,400
    • Investment 950
    • Retirement 550
    QWin Premier subscription
  • shabandri (Quicken Windows Subscription Member ✭✭)

    As I mentioned earlier, I have data going back to 2005 and the data file is 171 MB. It takes over 5 seconds to accept one downloaded transaction. Here are the numbers of transactions in each account type:

    • Spending - 8500
    • Savings - 65
    • Credit - 18409
    • Investment - 2748
    • Retirement - 1352

    Thanks

  • skeleton567 (Quicken Windows Other Member ✭✭✭✭)

    Jim and Chris, while it is true that the NUMBER of transactions MAY have SOME relation to performance, I would venture that it is most likely not the root cause. I say this after a 42-year career in IT in which I designed and developed database systems and supported application development for several major corporations, such as Bendix Brake and Steering Corporation, Wheelabrator-Frye Corporation, HON Industries, Musco Sports Lighting, Bandag Inc., and other smaller companies in banking, distribution, retail, etc. I designed and created databases in pretty much all of the major database development systems that exist or have existed, and created front-end applications in about seven programming languages.

    This background is precisely why I tend to be critical of the issues users experience with this software, and why I get weary of all the blame-passing regarding problems with this application.

    If application performance is poor, the most likely cause is the DESIGN of the data storage and the DESIGN of the front-end application. We built systems with literally millions upon millions of transactions and, every hour, moved millions of records around the world between database systems using processes known as replication.

    Database software systems all have many, many design features and capabilities dedicated to performance across trillions of data elements, but that performance depends on the DESIGN of how the data is stored, retrieved, and transmitted (see the indexing sketch after this post).

    When I began my work in IT in 1968, we ran whole companies 24/7/365 on huge multiple-cabinet computers with far less computing power and data storage than the laptop I'm writing this on.

    I have to cringe at the performance issues reported for the minuscule numbers and quantities of transactions being discussed here, and at the sheer number of data transmission issues reported on this site.

    While it is probably true that the number of transactions MAY be a problem, there is NO EXCUSE for that to be the case.

    Ask yourself how world-class corporations such as Amazon can run worldwide websites that perform far better than a single-user application on a single-user personal computer.

    And don't get me started on the number of reported failures for simply connecting and downloading data from financial institutions. How would you like it if your Amazon account caused that much grief?


    Faithful Q user since 1986, with historical data beginning in 1943, programmer, database designer and developer for 42 years, general troublemaker on Community.Quicken.Com
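
    To make the storage-design point concrete: below is a minimal sketch, assuming a generic SQLite-style embedded store with a made-up schema (not Quicken's actual, undocumented internal format), of how a single index turns a per-account transaction lookup from a full table scan into an index search.

    ```python
    # Illustrative schema only; an assumption, not Quicken's real one.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute(
        "CREATE TABLE txn (id INTEGER PRIMARY KEY, account_id INTEGER, "
        "posted_on TEXT, amount REAL)"
    )
    # 100,000 fake rows spread across 200 accounts.
    con.executemany(
        "INSERT INTO txn (account_id, posted_on, amount) VALUES (?, ?, ?)",
        [(i % 200, "2024-01-01", 1.0) for i in range(100_000)],
    )

    # Without an index on account_id, filtering by account is a full scan.
    print(con.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM txn WHERE account_id = 42"
    ).fetchall())   # -> SCAN txn

    # With an index, the same filter becomes an index search.
    con.execute("CREATE INDEX idx_txn_account ON txn (account_id)")
    print(con.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM txn WHERE account_id = 42"
    ).fetchall())   # -> SEARCH txn USING INDEX idx_txn_account
    ```

    Once the lookup is indexed, the row count barely matters, which is the sense in which design, rather than raw transaction counts, tends to drive performance.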
  • omandel (Quicken Windows Subscription Member ✭✭)

    I have the same issue.
    My data is from ~2005. The data file is 160 MB.

    I have over 200 accounts (most are closed). The biggest accounts are one savings account with more than 6,000 transactions and two spending accounts with more than 6,000 and over 7,000 transactions, respectively.

    Response has been very slow over the last year. When I download transactions from the QIF file, it takes 4-5 seconds per transaction (see the QIF sketch after this post). Accepting a downloaded transaction takes 1-3 seconds per transaction, and moving between accounts is very slow.

    Any suggestions on how to optimize the data, or should I split the biggest account into an archive part and an active part?
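
    On the QIF import times mentioned above: QIF is a simple line-oriented text format in which each field is one line with a single-letter code (D for date, T for amount, P for payee, and so on) and '^' ends a record. A rough reading sketch, with splits and investment records omitted, is below; parsing even thousands of such records takes a fraction of a second, which suggests multi-second per-transaction delays come from the application's matching and database work rather than from reading the file.

    ```python
    # Rough sketch of a QIF reader (field codes per the published QIF
    # conventions; cleared status, splits, and investment records omitted).
    def parse_qif(path):
        field_names = {"D": "date", "T": "amount", "P": "payee",
                       "M": "memo", "L": "category", "N": "number"}
        transactions, current = [], {}
        with open(path, encoding="utf-8") as f:
            for raw in f:
                line = raw.rstrip("\r\n")
                if not line or line.startswith("!"):   # blanks and !Type: headers
                    continue
                if line == "^":                        # '^' closes one record
                    if current:
                        transactions.append(current)
                    current = {}
                else:
                    code, value = line[0], line[1:]
                    if code in field_names:
                        current[field_names[code]] = value
        return transactions

    # Example: txns = parse_qif("checking.qif"); print(len(txns))
    ```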

  • Yelnak (Member ✭✭✭)

    I'd had the same issue: a 400 MB data file taking 3 seconds per transaction.

    This page helped me: Troubleshooting

  • Craig069 (Quicken Windows Subscription Member)
    edited January 4

    My data file is only 312 MB, and I was advised that it is not necessary to close out years. Well, this is not true if speed is important. Making a backup file at year end and starting over is the only way I know of to improve response times; the large data file was taking 5-6 seconds per changed transaction. In my opinion, the data structure in Quicken Business Classic for PC is very poor. I have a customer using a commercial accounting program backed by a SQL database, and transactions take milliseconds against a 9 GB database. It's all about proper indexing. I am running an HP Xeon Z4 workstation with 16 GB of RAM.
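
    Indexing mostly governs how fast reads are. For writes, a further general factor in any embedded SQL store, offered purely as a generic sketch and not as a statement about Quicken's internals, is whether each changed record is committed (and synced to disk) on its own or batched into one transaction, since per-record commits pay the disk-sync cost every time:

    ```python
    # Generic illustration: per-row commits versus one batched transaction
    # in an embedded SQL store. Not a claim about how Quicken stores data.
    import os, sqlite3, tempfile, time

    path = os.path.join(tempfile.mkdtemp(), "demo.db")
    con = sqlite3.connect(path)
    con.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, amount REAL)")
    rows = [(i, 1.0) for i in range(500)]

    start = time.perf_counter()
    for r in rows:                       # one commit (one disk sync) per row
        con.execute("INSERT INTO txn VALUES (?, ?)", r)
        con.commit()
    per_row = time.perf_counter() - start

    con.execute("DELETE FROM txn")
    con.commit()

    start = time.perf_counter()
    with con:                            # a single transaction for the whole batch
        con.executemany("INSERT INTO txn VALUES (?, ?)", rows)
    batched = time.perf_counter() - start

    print(f"per-row commits: {per_row:.2f}s, one batched commit: {batched:.2f}s")
    ```

    On a typical disk the per-row version is dramatically slower, which is one common reason single-record updates can feel sluggish regardless of the overall file size.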

This discussion has been closed.