If it's anything like the way specialist science-lab IT works, I can definitely see that. There are plenty of chemistry labs with ancient OSs on some machines because they have some piece of equipment that doesn't have drivers for anything newer. Basically a disconnect between lifecycles: a good piece of chemistry equipment is supposed to last 20+ years, especially if it's doing something relatively routine and well-understood, whereas software seems not to usually be intended to stick around for 20 years.
> Basically a disconnect between lifecycles: a good piece of chemistry equipment is supposed to last 20+ years, especially if it's doing something relatively routine and well-understood, whereas software seems not to usually be intended to stick around for 20 years.
This is a good observation, and would probably make a great topic for a much larger discussion/essay.
Consistency is huge. It's amazing what some of those guys will get out of their equipment. I spent some time talking with a guy who builds data-acquisition equipment. Most of his customers started using his sensors in the 80s, and they want to keep the exact same setup until the system either breaks beyond repair or the entire team of scientists has retired. Amazing dedication, and testing with the same equipment over many years can lead to some great long-term observations.
Last year I worked for a company that runs a huge (hundreds of thousands of lines) MS Access 97 application.
Migrating to Access 2003 would need a lot of changes, and (worse) they have no version control, no schema for the application, no documentation... The code is a mess, written by bad programmers.
So what to do? Start over from scratch targeting Office 2007? Reaching that decision took them around two years, after a failed attempt to move their Access tables to Microsoft SQL Server (the speed was insanely slow at large-corporation scale).
I need to find companies/projects like these. I would have loved to solve this problem in one way or another.
a) Rewrite as a web app on the platform of your choice.
b) Put the entire DB in RAM on a really fast Windows server using SuperVolume. I'm guessing the MDB is less than 16GB in size. Set up Citrix/XenApp on it and deploy MS Access 97 as a standalone Xen application. Users only need to log on to http://citrix.company.com and click the "Access 97" icon. It will launch Access as a local app, but it actually runs on the server. You get local-disk speeds on the DB with minimal network usage. XenApp licenses are very cheap compared to even new Office licenses, let alone a full rewrite. If users don't need to change the Access 97 code/queries, you can deploy the Access runtime instead of the full product.
c) I'm guessing they already tried moving the tables to MS SQL while just linking them from Access 97. If that isn't possible because the code uses tons of Access-specific queries, mirror every table to MS SQL periodically, depending on the frequency of updates, and convert the slowest read-only reports to use the data from SQL while data entry continues to use the Access 97 tables (a sketch of this follows the list).
d) Break up the single Access DB into multiple DBs if possible. If Finance and Warehouse modules share almost nothing in common, they don't need to be on the same DB. If the 'employee' table is used to authenticate logins, put that table on MS SQL and link it from both Finance.mdb and Warehouse.mdb.
e) Implement a decent ERP system if this application is really core to the company's operation. MS Dynamics NAV would probably be their best bet.
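To make (c) concrete, here is a minimal sketch of a one-table mirror pass using the ODBC C API. Everything specific in it is an assumption: the "Access97" and "SqlMirror" DSN names and the employee table with its two columns are placeholders, and a real job would walk the schema, batch the inserts, and escape values properly.

    /* Sketch only: mirror one table from an Access 97 DSN to a SQL Server
       DSN via the ODBC C API. DSN names, the employee table, and its columns
       are assumed placeholders. Link with odbc32.lib. */
    #include <windows.h>
    #include <sql.h>
    #include <sqlext.h>
    #include <stdio.h>

    int main(void)
    {
        SQLHENV env; SQLHDBC src, dst; SQLHSTMT rd, wr;

        SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &env);
        SQLSetEnvAttr(env, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
        SQLAllocHandle(SQL_HANDLE_DBC, env, &src);
        SQLAllocHandle(SQL_HANDLE_DBC, env, &dst);
        SQLConnect(src, (SQLCHAR *)"Access97", SQL_NTS, NULL, 0, NULL, 0);
        SQLConnect(dst, (SQLCHAR *)"SqlMirror", SQL_NTS, NULL, 0, NULL, 0);
        SQLAllocHandle(SQL_HANDLE_STMT, src, &rd);
        SQLAllocHandle(SQL_HANDLE_STMT, dst, &wr);

        /* Wipe and reload: crude, but simple and predictable for a mirror. */
        SQLExecDirect(wr, (SQLCHAR *)"DELETE FROM employee", SQL_NTS);
        SQLExecDirect(rd, (SQLCHAR *)"SELECT id, name FROM employee", SQL_NTS);

        SQLINTEGER id; SQLCHAR name[64]; SQLLEN ilen, nlen;
        SQLBindCol(rd, 1, SQL_C_SLONG, &id, 0, &ilen);
        SQLBindCol(rd, 2, SQL_C_CHAR, name, sizeof(name), &nlen);

        while (SQL_SUCCEEDED(SQLFetch(rd))) {
            char sql[256];
            sprintf(sql, "INSERT INTO employee (id, name) VALUES (%ld, '%s')",
                    (long)id, name);            /* no escaping: sketch only */
            SQLExecDirect(wr, (SQLCHAR *)sql, SQL_NTS);
        }

        SQLFreeHandle(SQL_HANDLE_STMT, rd); SQLFreeHandle(SQL_HANDLE_STMT, wr);
        SQLDisconnect(src); SQLDisconnect(dst);
        SQLFreeHandle(SQL_HANDLE_DBC, src); SQLFreeHandle(SQL_HANDLE_DBC, dst);
        SQLFreeHandle(SQL_HANDLE_ENV, env);
        return 0;
    }

Schedule a job like this at whatever interval matches the update frequency; wipe-and-reload keeps the mirror logic trivial at the cost of briefly empty tables during each refresh.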
Thanks for your insightful comment. Here are my comments on your points.
a) They have now started doing exactly that.
b) The companies don't need it. They aren't really bothered by Access 97 on Windows 2000 or XP; all they need is our application, and an old OS/application means old, cheap hardware will do.
c) We tried simple linking in MS Access; no improvement in speed. We would need to rewrite all the queries from scratch. So many problems showed up that, with a code base this huge, rewriting from scratch seemed more feasible and easier.
d) The application is huge; it has dozens of DBs.
e) I don't know what you're talking about, but I'll do a little research.
Those were just five different ways you could have gone; (a) is what I would have suggested too. I'm going to guess the main reason for the speed issues is bad queries (m*n joins, few or no indexes, etc.). The best solution in these cases is often a rewrite.
What I meant by (b) was to put the entire database in RAM and let users access it by logging on to the server directly. You can put the MDB files on a file-server network share (like P:\Database\Company.mdb) and 10 people can use them at the same time, as long as they all have Access 97 on their local machines and read/write access to the P:\Database folder. Instead of doing it that way, try this for super speed: set up a fast Windows server with tons of RAM, create/format a 2GB-16GB partition as drive D:, and install SuperVolume to load that entire D: drive into RAM. Move the MDB files there.
Don't make users open the MDB files from their PCs directly. Let them log on to a Windows server as Terminal Server (TS) users and access the files from the D: drive. If you have some money, install Citrix so that logging on to the TS is just a matter of visiting a website and signing in. This lets people from all over the world use the Access MDB at exactly the same speed as anyone physically logged into the server; with Citrix, most users only need 20-32 kbps of bandwidth.
Users working on the same database crash it. It seems that when users send requests at the same time, the DB crashes (it can be repaired, but that's annoying). It doesn't always happen, but having more than 10 users increases the odds.
So the short-term solution was Terminal Server. It worked great and the speed was fine, until after a few days something in Windows Server slowed it down (some kind of caching or the like), so the illiterate technician just concluded that Windows Server is bad.
We are now (OK, I left) settling on a modern web or desktop platform. They're aiming to finish one part of the software this year. We'll see what they do ;)
Windows 95 is still used extensively in embedded programming due to the unparalleled direct hardware access it provides. When you are developing code for devices that use the 8086, or even older FPGAs, you don't need the latest OS. In fact, newer versions of Windows have more security and device layers, which block the older programs used for uploading code to those older microprocessors.
As a result, some of the old chipsets can only be programmed using old software that only works on 95. For a business that does embedded programming, it isn't feasible to switch to a new chipset, which would require a complete code rewrite, just so they can use a newer OS.
Interestingly, embedded programming is also an extremely high-paying job because there aren't that many people who can do it.
Why wouldn't you just use Linux for embedded systems? Seems wasteful to use a GUI-based OS for something that will most likely be a single-purpose device.
Toolchain and development cycle. Windows 95 systems programming is trivial, and not in a good way. You can access privileged system registers and memory areas directly, until you crash your box. Once you learn how to write a VxD, it's a matter of tweaking it for other purposes, since the great difficulty lies in just getting the skeleton to work.
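To illustrate just how direct that access is: on Windows 9x, a plain user-mode Win32 program can usually hit I/O ports straight away, where the NT line traps it as a privileged instruction. A minimal sketch, assuming MSVC's _inp/_outp intrinsics, with the LPT1 data port (0x378) purely as an assumed example:

    /* Sketch: direct port I/O from user mode, the kind of thing Windows 9x
       lets you get away with. 0x378 (LPT1 data port) is just an example;
       build as a 32-bit MSVC console app. */
    #include <conio.h>   /* _inp, _outp */
    #include <stdio.h>

    #define LPT1_DATA 0x378

    int main(void)
    {
        _outp(LPT1_DATA, 0xFF);           /* drive all eight data lines high */
        int status = _inp(LPT1_DATA + 1); /* read the adjacent status port */
        printf("LPT1 status: 0x%02X\n", status);
        return 0;
    }

On NT/2000/XP the same binary raises a privileged-instruction exception unless a helper driver grants port access, which is exactly the kind of "security and device layers" mentioned above.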
Linux is mostly that easy, in my experience. It may be easier, since there are probably already drivers for most of your chips. At any rate, once you have a skeleton kernel-mode driver, it's just a matter of tweaking it for other purposes.
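For comparison, the skeleton kernel-mode driver being described is roughly this big on Linux; the module name and log messages here are placeholders:

    /* Sketch: minimal Linux kernel-module skeleton. All the device-specific
       work (probing hardware, registering a device) would hang off these
       two hooks. */
    #include <linux/init.h>
    #include <linux/module.h>

    static int __init skel_init(void)
    {
        pr_info("skel: loaded\n");    /* register devices, map I/O here */
        return 0;
    }

    static void __exit skel_exit(void)
    {
        pr_info("skel: unloaded\n");  /* unregister and release resources */
    }

    module_init(skel_init);
    module_exit(skel_exit);
    MODULE_LICENSE("GPL");

Build it with the usual one-line kbuild Makefile (obj-m += skel.o) and load it with insmod.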
How many point-of-sale systems have you seen that run on Windows 3.1 or old curses-ish UIs? I'm sure it's even more common out of public view: in airport ticket kiosks, warehouse inventory systems, and the like. I saw my share of anachronistic systems working for a Midwestern public library system.
DOS would be less surprising to me than Windows 3.1 or 95, because DOS is still remotely useful in a way the latest MS software isn't: it provides simple, direct access to the hardware with no scheduler in the way, so it's probably great for hack-job embedded systems.
I remember it working with 3.1 even, although it wasn't built in (you needed something called Trumpet Winsock). '98se had connection sharing, which did for your dial-up much the same thing your home router now does for your cable/DSL. (I thought I remembered that '95 did too, but Wikipedia says no... maybe it came with something you could install on '95 machines that would let them be clients? I know for a while we had an XP machine with the modem, and a '95 machine that could see the internet through it.)
I apologize for being a dick. Your question was about the internet, which did work with Windows 95, and so did the web; in fact, the whole dotcom era was powered by Windows 95 more than by any other OS.
Windows 95 had a native implementation of sockets, Winsock2; it also had telnet and a serial-communication utility. I ran Opera, IE, and Netscape Navigator Gold on it, Eudora for email, and Agent for a newsreader. Windows 98 shipped with Internet Explorer and had Active Desktop, the precursor to all the gadgets you see today. The Windows SDK included even more controls, including the MS HTML control that let you embed web views (i.e., HTML frames) inside your apps. All MS help was converted to HTML format as well, and the help compiler could convert HTML documents to CHM. It was only 10 years ago :-)
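For flavor, a minimal Winsock TCP client of the sort that ran fine on Windows 95 is only about this much code; "example.com" and port 80 are placeholder assumptions, and you'd link against ws2_32.lib (or wsock32.lib for the original Winsock):

    /* Sketch: minimal Winsock TCP client, fetches an HTTP response header. */
    #include <winsock2.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        WSADATA wsa;
        if (WSAStartup(MAKEWORD(2, 0), &wsa) != 0) return 1;

        SOCKET s = socket(AF_INET, SOCK_STREAM, IPPROTO_TCP);
        struct hostent *he = gethostbyname("example.com");  /* placeholder */
        if (s == INVALID_SOCKET || he == NULL) return 1;

        struct sockaddr_in addr;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_port = htons(80);
        addr.sin_addr = *(struct in_addr *)he->h_addr;

        if (connect(s, (struct sockaddr *)&addr, sizeof(addr)) == 0) {
            const char req[] = "HEAD / HTTP/1.0\r\n\r\n";
            send(s, req, sizeof(req) - 1, 0);
            char buf[512];
            int n = recv(s, buf, sizeof(buf) - 1, 0);
            if (n > 0) { buf[n] = '\0'; printf("%s", buf); }
        }

        closesocket(s);
        WSACleanup();
        return 0;
    }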
I still use the email account I opened using Windows 95 :-)
OK, the support ones make some sense, but I'm baffled by the one that wants experience with Matlab and Windows 95. Who uses Matlab under Windows 95 today? A scientific organization that hasn't been able to buy new hardware for more than a decade?
Assuming it's on the same machine: perhaps they have an instrument which runs on Windows 95 and they do some light data processing, or use Matlab to make it interoperate with another device?
We've had a similar need in my lab, although it was Windows 2000. We buy new machines all right, but some instruments can't be easily upgraded due to the computer being integrated with the instrument, the drivers being proprietary and OS-specific, etc.
I have a client still running an old NT box because it's the newest computer that can run a $100k+ piece of equipment. The equipment it's attached to is in excellent shape and is still considered state of the art.
Perhaps this is an important message to companies attempting to create rapid innovation in the health sector...