Endless reports of download errors
skeleton567
Quicken Windows Other Member ✭✭✭✭
You folks all have my sympathy regarding the download errors you receive. I worked in IT for 42 years, and never have I seen so many software failures reported as I have on this community.
My last eleven years working were spent as a DBA for a national corporation with a dealer network across the US and in a number of foreign countries. Our network did transaction uploads multiple times a day for combined corporate billing of national accounts and reporting back to dealers. Application failures were NOT ALLOWED. Anything wrong had to be handled without failure and gracefully reported back to the dealer for correction.
I'm also remembering creating a system as far back as the mid-1980s that used little 10-key hand-held devices from Telxon Corporation for a sales force of 40 to manually enter customer orders and transmit them over a dial-up phone line into our small 'mainframe' (an IBM System/3 Model 10 with 32K bytes of memory). We handled 700-1,000 orders a day through that system and almost NEVER had any failures. And if there WERE problems, we actually called the salesperson to get them fixed, just to take care of the customer.
If something was wrong, our systems HAD TO HANDLE it and get it corrected. We couldn't just throw an error and quit.
It seems to me that if Q encounters something that would cause a failure, it should report it back to the transaction originator instead of just throwing an error at the user and leaving you on your own. At a bare minimum, it should tell you EXACTLY what the bad data is, instead of just failing.
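Just to illustrate what I mean (this is purely a sketch of the idea, not anything Quicken actually does, and the field names are made up for the example), a download import could validate each record and report exactly which field is bad instead of giving up with a generic error:

```python
# Illustrative sketch only -- the record layout and field names here are
# hypothetical, not Quicken's actual download format or code.

def validate_transaction(txn):
    """Return a list of human-readable problems instead of just failing."""
    problems = []
    for field in ("date", "amount", "payee"):
        if field not in txn or txn[field] in ("", None):
            problems.append(f"missing or empty field '{field}' in record {txn!r}")
    return problems

def import_download(transactions):
    imported, rejected = [], []
    for txn in transactions:
        problems = validate_transaction(txn)
        if problems:
            # Keep the good records, and report EXACTLY what is wrong with
            # the bad ones -- detail that could go to the user AND back to
            # the data provider for correction.
            rejected.append((txn, problems))
        else:
            imported.append(txn)
    return imported, rejected

# Example: one good record, one bad one
good = {"date": "2024-05-01", "amount": "-42.17", "payee": "Grocery Store"}
bad = {"date": "2024-05-02", "amount": "", "payee": "Gas Station"}
imported, rejected = import_download([good, bad])
for txn, problems in rejected:
    print("Rejected:", "; ".join(problems))
```

The point isn't the code, it's the behavior: handle what you can, and spell out what you couldn't and why, to whoever can actually fix it.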
If these data creators and providers got dinged enough, I'm pretty sure they would be more interested and proactive in getting problems solved. And if Q required more accuracy from them in order to have their data handled, things might get better.
Of course, this is assuming that the problem is not actually internal to Q.
Faithful Q user since 1986, with historical data beginning in 1943; programmer, database designer and developer for 42 years; general troublemaker on Community.Quicken.Com