Sage data files

The Sage Line 50 ACCDATA directory contains a load of files, and nowhere have I found any useful documentation as to what they are. Here’s a summary of what I think they do. They’re all data files unless otherwise stated; the ones described as index files are indexes into the corresponding data files. (There’s a short script after the list if you want to check what’s in your own ACCDATA folder.)

Anyone with more information is positively encouraged to leave a comment! Presumably Sage know, but they don’t seem that keen on publishing the information.

1..n.COA – Chart of Accounts
ACCESS.DTA – Access rights for users
ACCOUNT.DTA – Control information (stuff across all accounts – VAT?)
ACCRUAL.DTA – Accruals
CURRENCY.DTA – Currency
ACCSTAT.DTA – Account Status
ASSETS.DTA – Fixed Assets
ASTCAT.DTA – Fixed Asset Categories file
ASTINDEX.DTA – Fixed Asset index file
BANK.DTA – Bank
BANKWWW.DTA – Bank WWW data
BILLS.DTA – Bills
BNKINDEX.DTA – Bank index file
CATEGORY.DTA – Category definitions
CONTACT.DTA – Contacts
CONTINDA.DTA – Contact Records index file
CONTINDD.DTA – Contact Date index file
COURWWW.DTA – Courier Resources
CREDWWW.DTA – Credit Resources
DEPARTM.DTA – Departments
FINRATES.DTA – Credit Charge
HEADERS.DTA – Transaction Headers file
INVINDEX.DTA – Invoice Record index file
INVITEM.DTA – Invoice Line Items file
INVOICE.DTA – Invoice Headers
MISCWWW.DTA – Miscellaneous Resources
NOMINAL.DTA – Nominal
NOMINDEX.DTA – Nominal Record index file
PREPAY.DTA – Prepayments
PUOINDEX.DTA – Purchase Order index file
PUOITEM.DTA – Purchase Order Line Items file
PUORDER.DTA – Purchase Order Headers
PURCHASE.DTA – Suppliers
PURINDEX.DTA – Supplier Record index file
QUEUE.DTA – List of users currently using the system
RECUR.DTA – Recurring Entries
REMIT.DTA – Remittance Lines
REMITIDX.DTA – Remittance Line index file
SALES.DTA – Customers
SALINDEX.DTA – Customer Record index file
SAOINDEX.DTA – Sales Order index file
SAOITEM.DTA – Sales Order Line Items file
SAORDER.DTA – Sales Order Headers
SETUP.DTA – Setup information (manager passwords &c)
SPLITS.DTA – Transaction Splits file
STKCAT.DTA – Stock Categories
STKINDEX.DTA – Stock Record index file
STKTRANS.DTA – Stock Transactions file
STOCK.DTA – Stock
TODO.DTA – Task Manager
TODOIDX.DTA – Task Manager index file
USAGE.DTA – Transaction Usage file
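
If you want to see which of these files your own installation actually contains, and spot anything that isn’t in the list, a few lines of script will do it. This is only a sketch: the ACCDATA path is an assumption for your own setup, and the description table covers just a handful of entries – extend it from the list above. The sizes are worth a glance too, because the biggest files are usually the ones doing the most damage over the network.

import os

# Sketch: list the ACCDATA directory with sizes and (believed) purposes.
# The path and the description table are assumptions - adjust to suit.
ACCDATA = r"C:\Sage\Accounts\ACCDATA"
DESCRIPTIONS = {
    "ACCESS.DTA": "Access rights for users",
    "ACCOUNT.DTA": "Control information",
    "INVOICE.DTA": "Invoice headers",
    "NOMINAL.DTA": "Nominal",
    "PURCHASE.DTA": "Suppliers",
    "SALES.DTA": "Customers",
    "STOCK.DTA": "Stock",
    # ...extend from the list above
}

for name in sorted(os.listdir(ACCDATA)):
    size_kb = os.path.getsize(os.path.join(ACCDATA, name)) // 1024
    desc = DESCRIPTIONS.get(name.upper(), "not in the list above")
    print("%-14s %8d KB  %s" % (name, size_kb, desc))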

How to improve Sage network performance

If you accept that Sage Line 50 is fundamentally flawed when working over a network you’re not left with many options other than waiting for Sage to fix it. All you can do is throw hardware at it. But what hardware actually works?

First the bad news – the difference in speed between a standard server and a turbo-nutter-bastard model isn’t actually that great. If you’re lucky, on a straight run, you might see a four-fold improvement from a user’s perspective. The reason for spending lots of money on a server has little to do with the speed a user sees; it’s much more to do with the number of concurrent users it can support.

So, if you happen to have a really duff server and you throw lots of money at a new one you might see something that took a totally unacceptable 90 minutes now taking a totally unacceptable 20 minutes. If you spend a lot of money, and you’re lucky.

The fact is that, having analysed the server side of this equation, I’ve yet to see the server itself struggling with CPU time, running out of memory, or showing anything else to suggest that it’s the problem. The most problematic client started with a dual-core processor and 512MB of RAM – a reasonable specification for a few years back. At no time did I see any issues with memory size, and processor utilisation was only a few percent on one of the cores.
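
If you want to check this on your own server before spending money, a quick way is to log CPU and memory use while the users are busy in Sage. Here’s a minimal sketch using the third-party psutil package (the five-second interval and the log file name are arbitrary choices). If the figures stay low while Line 50 grinds away, the server itself isn’t your problem.

import time
import psutil  # third-party: pip install psutil

# Log per-core CPU and overall memory use every five seconds, so you can
# see whether the server is actually working hard while Sage crawls.
with open("server_load.log", "a") as log:
    while True:
        cores = psutil.cpu_percent(interval=5, percpu=True)
        mem = psutil.virtual_memory()
        log.write("%s cpu=%s mem=%.0f%%\n" % (
            time.strftime("%Y-%m-%d %H:%M:%S"),
            "/".join("%.0f" % c for c in cores),
            mem.percent))
        log.flush()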

I’d go as far as to say that the only reason for upgrading the server is to allow multiple users to access it on terminal server sessions, bypassing the network access to the Sage files completely. However, whilst this gives the fastest possible access to the data on the disk, it doesn’t overcome the architectural problems involved with sharing a disk file, so multiple users are going to have problems regardless. They’ll still clash, but when they’re not clashing it will be faster.

But, assuming you want to run Line 50 multi-user the way it was intended, with the software installed on the client PCs, you’re going to have to look away from the server itself to find a solution.

The next thing Sage will tell you is to upgrade to 1Gb Ethernet – it’s ten times faster than 100Mb, so you’ll get a 1000% performance boost. Yeah, right!

It’s true that the network file access is the bottleneck, but it’s not the raw speed that matters.

I’ll let you into a secret: not all network cards are the same.

They might communicate at a line speed of 100Mb, but this does not mean that the computer can process data at that speed, and it does not mean it will pass through the switch at that speed. This is even more true at 1Gb.

This week at Infosec I’ve been looking at some 10Gb network cards that really can do the job – communicate at full speed without dropping packets and pre-sort the data so a multi-CPU box could make sense of it. They cost $10,000 each. They’re probably worth it.

Have you any idea what kind of network card came built in to the motherboard of your cheap-and-cheerful Dell? I thought not! I’ll bet it wasn’t the high-end type, though.

The next thing you’ve got to worry about is the cable. There’s no point looking at the wires themselves, or at what the LAN card says it’s doing – you’ll never know. Testing that a cable has the right wires on the right pins is not going to tell you what it will do when you put data down it at high speed. Unless the cable is perfect it’s going to pick up interference to some extent, most likely from the wire running right next to it, but you’ll never know how much this is affecting performance. The wonder of modern networking is that errors on the line are corrected automatically without worrying the user about it. If 50% of your data gets corrupted and needs re-transmission then, by the time you’ve waited for the error to be detected, the replacement requested and the intervening data put on hold, your 100Mb line could easily be clogged with 90% junk – but the line speed will still read 100Mb with minimal utilisation.
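
You can at least get a hint of how much is being lost from the error and drop counters the operating system keeps for every network interface. The sketch below is for a Linux box, reading /proc/net/dev; on a Windows machine “netstat -e” gives you similar figures for the local adapter.

# Print receive/transmit error and drop counts per interface (Linux only;
# the counters live in /proc/net/dev). Numbers that keep climbing point at
# bad cabling, a duplex mismatch or a struggling network card.
with open("/proc/net/dev") as f:
    lines = f.readlines()[2:]  # skip the two header lines

for line in lines:
    name, stats = line.split(":")
    fields = stats.split()
    rx_errs, rx_drop = int(fields[2]), int(fields[3])
    tx_errs, tx_drop = int(fields[10]), int(fields[11])
    print("%-8s rx_errs=%s rx_drop=%s tx_errs=%s tx_drop=%s" %
          (name.strip(), rx_errs, rx_drop, tx_errs, tx_drop))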

Testing network cables properly requires some really expensive equipment, and the only way around it is to have the cabling installed by someone who really knows what they’re doing with high-frequency cable to reduce the likelihood of trouble. If you can, hire some proper test gear anyway. What you don’t want to do is let an electrician wire it up for you in a simplistic way. They all think they can, but believe me, they can’t.

Next down the line is the network switch, and this could be the biggest problem you’ve got. Switches sold to small businesses are designed to be ignored, and people ignore them. “Plug and Play”.

You’d be forgiven for thinking that there wasn’t much to a switch, but in reality it’s got a critical job, which it may or may not do very well in all circumstances. When it receives a packet (a sequence of data – a message from one PC to another) on one of its ports, it has to decide which port to send it out of to reach its intended destination. If it receives multiple packets on multiple ports it has to handle them all at once. Or one at a time. Or give up and ask most of the senders to try again later.

What your switch is doing is probably a mystery, as most small businesses use unmanaged “intelligent” switches. A managed switch, on the other hand, lets you connect to it using a web browser and actually see what’s going on. You can also configure it to give more priority to certain ports, protect the network from “packet storms” caused by accident or malicious software and generally debug poorly performing networks. This isn’t intended to be a tutorial on managed switches; just take it from me that in the right hands they can be used to help the situation a lot.
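
To give a flavour of what “seeing what’s going on” means in practice: most managed switches expose the standard IF-MIB counters over SNMP, so you can poll the error count on every port from a script instead of clicking around the web interface. The sketch below assumes the net-snmp command-line tools are installed and SNMP is enabled on the switch; the address, community string and port count are placeholders for your own kit, and port numbers don’t always map one-to-one onto SNMP interface indexes.

import subprocess

SWITCH = "192.168.1.250"  # placeholder - your switch's address
COMMUNITY = "public"      # placeholder - your read-only community string
OID_IF_IN_ERRORS = "1.3.6.1.2.1.2.2.1.14"  # IF-MIB::ifInErrors

# Poll the inbound error counter for each port of an assumed 24-port switch.
for port in range(1, 25):
    result = subprocess.run(
        ["snmpget", "-v2c", "-c", COMMUNITY, "-Ovq",
         SWITCH, "%s.%d" % (OID_IF_IN_ERRORS, port)],
        capture_output=True, text=True)
    print("port %2d: inbound errors = %s" % (port, result.stdout.strip()))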

Unfortunately, managed switches cost a lot more than the standard variety. But they’re intended for the big boys to play with, and consequently they tend to switch more simultaneous packets and stand up to heavier loads.

Several weeks back I upgraded the site with the most problems from good-quality standard switches to some nice expensive managed ones, and guess what? It’s made a big difference. My idea was partly to use the switch to snoop on the traffic and figure out what was going on, but as a bonus it appears to have improved performance and, most importantly, reliability considerably.

If you’re going to try this, connect the server directly to the switch at 1Gb. It doesn’t appear to make a great deal of difference whether the client PCs are 100Mb or 1Gb, possibly due to the cheapo network interfaces they have, but if you have multiple clients connected to the switch at 100Mb they can all simultaneously access the server down the 1Gb pipe at full speed (to them).
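
A crude way to see what a client actually gets out of the server, as opposed to what the link speed claims, is to time a straight read of a large file on the share. The UNC path below is a placeholder – point it at any big file that lives on the same share as the Sage data – and bear in mind that a second run will be flattered by caching.

import time

# Time a sequential read of a big file on the server share and report the
# effective throughput. The path is a placeholder.
TEST_FILE = r"\\server\share\some_large_file.bin"
CHUNK = 1024 * 1024

total = 0
start = time.time()
with open(TEST_FILE, "rb") as f:
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        total += len(data)
elapsed = time.time() - start
print("read %.1f MB in %.1fs = %.0f Mbit/s" %
      (total / 1e6, elapsed, total * 8 / elapsed / 1e6))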

This is a long way from a solution, and it’s hardly been conclusively tested, but the extra reliability and resilience of the network has at least allowed a Sage system to run without crashing and corrupting data all the time.

If you’re using reasonably okay workstations and a file server, my advice (at present) is to look at the switch first, before spending money on anything else.

Then there’s the nuclear option, which actually works. Don’t bother trying to run the reports in Sage itself. Instead, dump the data to a proper database and use Crystal Reports (or the report generator of your choice) to produce them. I know someone who was tearing their hair out because a Sage report took three hours to run; the same report took less than five minutes using Crystal Reports. The strategy is to dump the data overnight and knock yourself out running reports the following day. Okay, the data may be a day old, but if it’s taking most of the day to run the report on the latest data, what have you really lost?
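
Here’s a minimal sketch of the dump-and-report idea, using SQLite as the “proper database” and assuming you can get the data out of Sage as CSV (its own export facilities, or an ODBC link, will give you something along these lines). The file name and column names are made up for illustration – match them to whatever your export actually produces. Crystal Reports, or anything else that talks to a database, can then be pointed at the same dump.

import csv
import sqlite3

# Nightly job: load an exported transactions CSV into SQLite, then run the
# reports against the database instead of against Sage itself.
conn = sqlite3.connect("sage_dump.db")
conn.execute("DROP TABLE IF EXISTS transactions")
conn.execute("CREATE TABLE transactions "
             "(tran_date TEXT, account TEXT, nominal TEXT, net REAL)")

with open("audit_trail_export.csv", newline="") as f:
    rows = [(r["Date"], r["Account"], r["Nominal"], float(r["Net"]))
            for r in csv.DictReader(f)]
conn.executemany("INSERT INTO transactions VALUES (?, ?, ?, ?)", rows)
conn.commit()

# Example report: net total per account - seconds rather than hours.
for account, total in conn.execute(
        "SELECT account, SUM(net) FROM transactions GROUP BY account"):
    print(account, round(total, 2))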

I’d be really interested to hear how other people get on.

Sage 2011? Line 50 with a proper database

Today I ran into my “old friends” Sage at a computer show; they didn’t recognise me and tried to interest me in Sage Accounting for my business. I was wearing a suit, I suppose. As you can imagine, it didn’t take long for them to catch on, after which I turned the conversation to the subject of Line 50 using a proper database.

You might have got the impression I really don’t like Sage. That’s not strictly true; the issue is that I really don’t like the idea of Line 50 being sold to SMEs planning to use it for anything non-trivial. Interestingly, the people from Sage agree – at least in private. The database-driven version to cure the problem has been promised for four years, so they’re obviously aware of the issue!

So when’s it coming? Apparently in Sage 2011, due out in August 2010. “Really?” I said. “Yes, definitely. At least that’s the plan”, they said.

I pushed a bit further – would it be using mySQL as promised, or would they wimp out and use the lightweight Microsoft server? I didn’t get an answer, which confirms my fears, but even a Microsoft SQL server is better than the current arrangement.

I tried to discuss the performance issues for people upgrading to Line 50 Version 2010 with them, but I got the impression they were a bit jaded on the subject, and did a very poor job of feigning surprise.

Why is Sage Line 50 so slow?

NB. If you want to know how to make Sage run faster click here for later posts, and read the comments below (there are a lot!).

As regular readers will know, I don’t think much of Sage accounting software, especially Sage Line 50. It’s fatally flawed because it stores its data in disk files, shared across a network using a file server. I suspect these .DTA files are pretty much unchanged since Graham Wylie’s original effort running under CP/M on an Amstrad PCW. There is continual talk that the newer versions will use a proper database; indeed, in 2006 they announced a deal to work with mySQL. But the world has been waiting for the upgrade ever since. It’s always coming in “next year’s” release, but “next year” never comes. The latest (as of December 2009) is that they’re ‘testing’ a database version with some customers and it might come out in version seventeen.

In fact it’s in Sage’s interests to keep Line 50 running slower than a slug in treacle. Line 50 is the cheap end of the range – if it ran at a decent speed over a network, multi-user, people wouldn’t buy the expensive Line 200 (aka MMS). The snag is that Line 50 is sold to small companies that do need more than one or two concurrent users and do have a significant number of transactions a day.

So why is Line 50 so slow? The problem with Sage’s strategy of storing data in shared files is that when you have multiple users, the files are opened/locked/read/written by multiple users across a network at the same time. It stands to reason. On a non-trivial set of books this will involve a good number of files, some of them very large. Networks are slow compared to local disks, and certainly not as reliable, so you’re bound to end up with locked-file conflicts, and you’d be lucky if data wasn’t corrupted from time to time. As the files get bigger and the number of users grows, the problem gets worse exponentially. The standard Sage solution seems to be to tell people their hardware is inadequate whenever timeouts occur. In a gross abuse of their consultancy position, some independent Sage vendors have been known to sell hapless lusers new high-powered servers, which does make the problem appear to go away. Until, of course, the file gets a bit bigger. Anyone who knows anything about networking will realise straight away that this is a hopeless situation, but not those selling Sage – at least not in public.
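
To see why that’s hopeless, consider what every client effectively has to do for every update: take out a lock on part of a shared file across the network, do its work, release the lock, and retry whenever another user has the same region locked. The sketch below is a rough illustration of that pattern – it uses Unix-style fcntl locking because that’s easy to show in a few lines, whereas the real thing goes through Windows file sharing, but the back-off-and-retry loop is the point.

import fcntl
import random
import time

# What every client effectively does for each write: lock the shared file,
# update it, unlock it - backing off and retrying while someone else holds
# the lock. Multiply by users and transactions and the waiting adds up fast.
def update_record(path, attempts=50):
    with open(path, "r+b") as f:
        for _ in range(attempts):
            try:
                fcntl.lockf(f, fcntl.LOCK_EX | fcntl.LOCK_NB)
            except OSError:
                time.sleep(random.uniform(0.05, 0.2))  # another user has it
                continue
            try:
                time.sleep(0.1)  # stand-in for reading and rewriting the record
            finally:
                fcntl.lockf(f, fcntl.LOCK_UN)
            return True
    return False  # gave up - this is the timeout the user sees

if __name__ == "__main__":
    open("shared.dta", "ab").close()  # make sure the demo file exists
    print(update_record("shared.dta"))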

One Sage Solution Provider, realising that this system was always going to time-out in such circumstances, persuaded the MD of the company using it to generate all reports by sitting at the server console. To keep up the pretence this was a multi-user system, he even persuaded them to install it on a Windows Terminal Server machine so more than one person could use it by means of a remote session.

If that weren’t bad enough, apparently it didn’t even work when sitting at the console, and they advised the customer to get a faster router. I’m not kidding – this really did happen.

The fact is that Sage Line 50 does not run well over a network due to a fundamental design flaw. It’s fine if it’s basically single-user on one machine, and I have clients using it this way. If you want to run multiple users, especially if your books are non-trivial, you need Sage 200/MMS – or a different accounting package altogether.