LIS System Configuration and Edge Computing
LIS System Configuration: Client/Server or SaaS/Cloud, Which is Best? Why Not Make the Same Decision Google Made?
If you’re considering different LIS configurations, you’re probably considering Client/Server and Software-as-a-Service (SaaS)/Cloud configurations. Client/Server (C/S) systems have been the de facto configuration for many years, while SaaS solutions are becoming more popular as applications migrate to the cloud. But there’s another choice to consider: Edge Computing. Edge Computing is the choice that Google, Amazon, eBay and other leading tech companies made.
First, let’s step back and take a look at the three things you need your LIS to do regardless of configuration:
- Always be available to perform work
- Never have data lost or corrupted by downtime
- Manage lab tests with a reasonable budget
Why Are These Points Important?
When the server goes down in a Client/Server topology – and it will – two things happen. First, you can't do your work; your lab is, for all intents and purposes, closed for business until the server is back up. Second, in-process data gets lost or corrupted, or worse, you're not sure what data was created and where the transmission to the server stopped. This means significant time and effort must be spent reviewing all data and lab tests to confirm that tests were actually performed, and rebuilding whatever data was lost or corrupted.
Of course you can throw money and resources at the network in terms of hardware and personnel to reduce the probability of the server going down. But there’s no guarantee this will work. Worse, this can get very expensive, and you still can’t be certain the server will be up 100% of the time.
Many companies are turning to cloud-based SaaS LIS solutions. With these, you connect via the Internet to a vendor-hosted system; the vendor has spent the money and employs the personnel to keep the probability of downtime to a minimum, and charges you a reasonably low fee to manage your patient lab data. Sounds good, huh? Not so fast.
With SaaS solutions, much of the risk shifts to the Internet connection. Every time your Internet goes down, you face essentially the same problems as when your own server goes down: no ability to process data, lost data, and corrupted data. Back to square one. Further, with a SaaS/Cloud solution, data ownership can be an issue. Once you send your data to the vendor's server, the vendor effectively controls it, including your ability to retrieve it.
There's a third option to consider. Let me tell you a little story: Google made a corporate decision to be available 100% of the time to any part of the world where the Internet is running. They certainly have the resources and personnel to create the best system possible. However, they realized that the downtime risks of Client/Server and SaaS remained no matter what they threw at them, and chose an advanced third option: Edge Computing.
Edge Computing pushes applications, data and computing power (services) away from centralized points to the edge, or logical extremes, of a network. It is a form of computing that most IT professionals are only just discovering, and it has several advantages:
- Edge Computing eliminates or de-emphasizes the core computing environment, limiting or removing a major bottleneck and a potential point of failure. If one node goes down, the other nodes continue to work.
- The system is self-healing. When a down node comes back up, all of its data is fully restored.
- It is self-distributing. One update on a node automatically distributes to all the other nodes. No running around and installing thick or thin clients on multiple terminals.
- It is the lowest cost solution in terms of both capital and personnel.
- Edge application services significantly decrease the data volume that must be moved, the consequent traffic, and the distance the data must go, thereby reducing transmission costs, shrinking latency and improving quality of service.
- Security is also better than what's achievable via C/S or SaaS: data stays encrypted as it moves in toward the network core, and as it approaches the enterprise it is checked at protected firewalls and other security points, where viruses, compromised data and active hackers can be caught early.
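To make the self-healing and no-single-point-of-failure ideas concrete, here is a minimal, hypothetical sketch of two edge nodes. It is not Polytech's actual implementation; the `EdgeNode` class, its last-writer-wins sync, and all names are illustrative assumptions. Each node keeps a full local copy of the records, so work continues if a peer is down, and a returning node is restored by syncing with any live peer.

```python
import time
from dataclasses import dataclass, field

@dataclass
class EdgeNode:
    """Hypothetical edge node: every node holds a full local copy of the
    data, so the lab keeps working even when a peer is unreachable."""
    name: str
    log: dict = field(default_factory=dict)  # record_id -> (timestamp, value)

    def write(self, record_id: str, value: str) -> None:
        # Writes always succeed locally; no central server is required.
        self.log[record_id] = (time.time(), value)

    def sync_with(self, peer: "EdgeNode") -> None:
        # Self-healing: exchange records in both directions; the newer
        # timestamp wins (last-writer-wins is a simplification; real
        # systems use richer conflict resolution).
        for rid, entry in peer.log.items():
            if rid not in self.log or entry[0] > self.log[rid][0]:
                self.log[rid] = entry
        for rid, entry in self.log.items():
            if rid not in peer.log or entry[0] > peer.log[rid][0]:
                peer.log[rid] = entry

# Node B is "down" while Node A keeps accepting lab results;
# when B returns, one sync call restores its data.
node_a = EdgeNode("lab-station-1")
node_b = EdgeNode("lab-station-2")
node_a.write("CBC-001", "result=normal")
node_a.sync_with(node_b)
```

After `sync_with`, both nodes hold the `CBC-001` record, which is the "no lost data when a node comes back" property the bullets above describe.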
What Does This Mean for the Clinical Lab?
By pushing LIS data away from the core computing environment, it doesn’t matter if a node goes down: the LIS continues to be available and the lab can continue to process patient tests and data. Even better, when the down node is back up, all data is fully restored. This means no lost data, no corrupted data, no concerns whether data is compromised or inaccurate.
If Google – which has the resources for any kind of system configuration – chose Edge Computing, and Edge Computing is also extremely affordable, maybe it is the right solution for you. If Google gets zero downtime at the lowest cost, why not you?
Comp Pro Med's Polytech LIS is built on Edge Computing. We know a good thing when we see it. And we're adding new functionality and capabilities every month to make Polytech even better. Find out more: set up a no-obligation web demo to see the differences you'll get with an LIS powered by Edge Computing.