Quicken Data in the Cloud
I'm sure this isn't a new idea, but here goes anyway:
I have used Quicken Classic on a home-based Windows PC for 20 years. I'd like my primary Quicken data file to be stored in the cloud, so I could run full Quicken Classic from a Windows or Mac laptop when I am away from home. I would never run two simultaneous sessions; if I tried, Quicken should simply prohibit me from launching the new session.
I suppose I could do this today by loading a backup of the data file into a second instance of Quicken Classic, but if I then neglected to restore the primary PC from backup when I returned home, ugly data corruption would occur.
If implemented, this would be my all-time biggest Quicken enhancement.
Ken
Comments
This is an interesting idea.
I have not experimented with this, but I suspect that Quicken already tries to lock the data file while it is in use. However, there are repeated reports and suspicions of file corruption indicating that this is not 100% reliable. For example, it is widely assumed that OneDrive, Dropbox, and other cloud storage systems can cause corruption when they access the data file to synchronize their cloud copy with the local copy.
How would Quicken know that the file is in use by another system?
If the other copy fails to release its lock, due to a crash for example, how could the locked data file be unlocked?
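Both questions come down to how operating-system file locks behave. As a sketch only, here is a POSIX analogue in Python using an advisory `flock` as a stand-in for whatever share-mode flag Quicken actually sets on Windows (the file name is a made-up placeholder): a second opener is refused while the lock is held, and the OS releases the lock automatically when the holder closes the file or its process exits, including after a crash.

```python
import fcntl
import os
import tempfile

# Hypothetical data file standing in for a Quicken .qdf
path = os.path.join(tempfile.mkdtemp(), "Current.qdf")
open(path, "w").close()

# The first "Quicken instance" takes an exclusive advisory lock.
first = open(path, "r+")
fcntl.flock(first, fcntl.LOCK_EX | fcntl.LOCK_NB)

# A second opener on the same machine sees the file as busy...
second = open(path, "r+")
try:
    fcntl.flock(second, fcntl.LOCK_EX | fcntl.LOCK_NB)
    busy = False
except BlockingIOError:
    busy = True
print("file in use:", busy)  # file in use: True

# ...until the first instance releases the lock. A crash has the
# same effect: the kernel drops the lock when the process dies,
# so a stale lock cannot outlive its owner.
fcntl.flock(first, fcntl.LOCK_UN)
fcntl.flock(second, fcntl.LOCK_EX | fcntl.LOCK_NB)  # now succeeds
```

The catch, and the answer to "how would Quicken know the file is in use by another system," is that this kind of lock only exists between processes on the same machine. A cloud-sync service reading a copy on another computer never sees it.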
QWin Premier subscription
Note that Q expects its data file to be local to the PC and thus lacks the "Lock", "Commit", and "Rollback" commands that would be necessary for ANY safe remote processing.
Furthermore, QWin and QMac use entirely different file structures and the data file of one can't be used by the other, so "full Quicken Classic from a Windows or Mac laptop" simply can't be done.
Q user since February, 1990. DOS Version 4
Now running Quicken Windows Subscription, Business & Personal
Retired "Certified Information Systems Auditor" & Bank Audit VP
You mentioned Windows and Mac laptops. Quicken Windows and Quicken Mac data files/programs aren't compatible with each other, so while it is possible to set up a system where two Quicken Windows laptops get "access" to the data file at separate times, there isn't any way to do that from a Mac laptop unless it is running Windows in a virtual machine. And the reverse is true for Quicken Mac.
@Jim_Harman is correct that the lock Quicken puts on the data file will not carry over to services like the "cloud folders" on the remote machine(s). It "somewhat" carries over to network drives, but given all the possible network configurations this locking isn't 100% reliable. Compounding the problem is how Quicken does the locking. When it opens the data file, Quicken sets a flag telling Windows not to allow any other application to access it. But there are times when Quicken closes the data file (which removes the lock) and expects to reopen it shortly afterwards. That creates short periods when the data file isn't locked, and if the cloud service syncs at that moment it will cause problems.
I can see a way to implement this, though I doubt Quicken Inc would. That wouldn't stop a user from doing it with a script.
Let me say right off the bat: you do not want to use backup and restore for this. Restore "syncs the data file," and repeated use of it can cause problems.
Your actual goal is to always use the most current copy of the data file. Quicken doesn't have problems "syncing" from different locations. What it has problems with can be shown this way. You have to imagine your data file as being split into two parts.
Part A, the data file on your computer.
Part B, the data that is in sync with it in the Quicken Cloud dataset.
Whenever you open your data file and do anything with an online service, you have to assume something matching was changed in the Quicken Cloud dataset too.
If you take a copy of the current data file to another machine and it connects to the Quicken Cloud dataset, that will work fine. But as you noted, if you later connect with the "unchanged data file" on the first machine, the two will be out of sync, and that can cause problems.
What will get you 95% to this goal is simply automating copying your data file from/to the cloud folder.
With this, as long as you close the data file on each machine after use (and both machines are set up with this automation), it should work fine.
The "script" looks like:
- Copy data file from cloud folder to local folder.
- Run Quicken with the local data file (the script waits for Quicken to close).
- Copy data file from local folder to cloud folder.
To get to about 99%, introduce a lock file. It isn't 100% because of timing issues, but assuming that opening the data file on the other machine takes some time, it would be very close to 100%.
The steps above just change to this:
- Check if there is a lock file in the cloud folder; if it is there, print a message and exit.
- Create the lock file in the cloud folder.
- Copy data file from cloud folder to local folder.
- Run Quicken with the local data file (the script waits for Quicken to close).
- Copy data file from local folder to cloud folder.
- Delete the lock file in the cloud folder.
The following script does this. To use it, create a file with Notepad and copy the script below into it. When saving, make sure you change the .txt extension in the name to .cmd. You can name the file whatever you like. You need to edit DATA_FILE, LOCAL_DATA_DIR, and QUICKEN_REMOTE_DIR to reflect your setup.
@echo off
set DATA_FILE=Current.qdf
set LOCAL_DATA_DIR=c:\Quicken
set QUICKEN_REMOTE_DIR=%USERPROFILE%\OneDrive\Documents\Quicken
set LOCK_FILE=%QUICKEN_REMOTE_DIR%\%DATA_FILE%.lock
IF EXIST "%LOCAL_DATA_DIR%" GOTO DIR_EXISTS
mkdir "%LOCAL_DATA_DIR%"
:DIR_EXISTS
IF NOT EXIST "%LOCK_FILE%" GOTO NO_LOCK_FILE
start /wait cmd /C "ECHO %LOCK_FILE% exists, Exiting. && PAUSE"
GOTO SKIP_DELETE
:NO_LOCK_FILE
ECHO "Quicken Lock File" > "%LOCK_FILE%"
copy "%QUICKEN_REMOTE_DIR%\%DATA_FILE%" "%LOCAL_DATA_DIR%\%DATA_FILE%"
IF %ERRORLEVEL% == 0 GOTO COPY_PASSED
start /wait cmd /C "ECHO %QUICKEN_REMOTE_DIR%\%DATA_FILE% copy of data file failed. && PAUSE"
GOTO END
:COPY_PASSED
"%ProgramFiles(x86)%\Quicken\qw.exe" "%LOCAL_DATA_DIR%\%DATA_FILE%"
copy "%LOCAL_DATA_DIR%\%DATA_FILE%" "%QUICKEN_REMOTE_DIR%\%DATA_FILE%"
IF %ERRORLEVEL% == 0 GOTO END
start /wait cmd /C "ECHO %LOCAL_DATA_DIR%\%DATA_FILE% copy of data file failed. && PAUSE"
:END
del "%LOCK_FILE%"
:SKIP_DELETE
To run this you just double-click on the *.cmd file you created.
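The batch file above is Windows-only. As a hedged sketch for those comfortable with Python, the same copy/lock/run/copy-back flow could look like the function below (every path and the Quicken launch command in the example invocation are placeholders for your own setup). One improvement over the batch version: creating the lock file with os.O_CREAT | os.O_EXCL makes the check-and-create a single atomic step, narrowing the timing window mentioned above.

```python
import os
import shutil
import subprocess

def run_quicken_session(data_file, local_dir, remote_dir, quicken_cmd):
    """Pull the data file from the cloud folder, run Quicken on the local
    copy, then push it back, guarded by a lock file in the cloud folder.
    Returns 0 on success, 1 if another machine holds the lock."""
    lock_file = os.path.join(remote_dir, data_file + ".lock")
    os.makedirs(local_dir, exist_ok=True)
    try:
        # O_CREAT | O_EXCL: the open fails atomically if the lock
        # file already exists, so two machines cannot both "win".
        fd = os.open(lock_file, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        print(f"{lock_file} exists, exiting.")
        return 1
    os.write(fd, b"Quicken Lock File")
    os.close(fd)
    try:
        local = os.path.join(local_dir, data_file)
        remote = os.path.join(remote_dir, data_file)
        shutil.copy2(remote, local)                        # pull newest copy
        subprocess.run(quicken_cmd + [local], check=True)  # waits for Quicken to close
        shutil.copy2(local, remote)                        # push it back
    finally:
        os.remove(lock_file)                               # always release the lock
    return 0

# Hypothetical invocation -- adjust every path for your setup:
# run_quicken_session(
#     "Current.qdf",
#     os.path.expanduser("~/Quicken"),
#     os.path.expanduser("~/OneDrive/Documents/Quicken"),
#     [r"C:\Program Files (x86)\Quicken\qw.exe"],
# )
```

Note that, as pointed out above, the data file itself is edition-specific, so a script like this only helps between machines running the same Quicken edition (Windows with Windows, Mac with Mac).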
If you are comfortable having your data in the cloud, then I suggest you look at Quicken Simplifi. I tried it out for a couple of years, and it's a good product. The last time I used it was ~2 years ago, and they continue to evolve it.
Some things it does better than Quicken Classic:
- Data entry is faster than Quicken desktop.
- It does a better job of auto-categorizing.
It does track investment accounts, though the reporting is not as flexible. When I was using it, it did not have cost-basis tracking, which was one of my reasons for migrating. I just looked; they added that capability shortly after I migrated back to Quicken Classic.
I like to explicitly accept my downloaded transactions one at a time so I can review them, versus having them automatically put into the register. Simplifi doesn't have this concept, but I was able to get the same outcome by tagging reviewed transactions with a tag named "processed". Whenever a new set of transactions was downloaded, I would simply search for everything that didn't have the "processed" tag.
Food for thought.
Marc
Wonderful insights. Thanks!


