IT Support and Hardware for Clinics
News, Information and Updates on Hardware and IT Tools to help improve your Medical practice

Servers In Medical Centres


Servers in medical centres are a common aspect of IT support and maintenance. Here are some tips for healthcare IT support.

 

Some practices use a PC configured to act as a server, other healthcare organisations have implemented a professional, business-grade server to store their medical applications, and some use cloud-based servers, which are essentially virtual servers.

 

At some stage in the business's lifecycle, the network or IT environment begins to slow down and become slightly unreliable, and you hear more and more frustration from the staff.

 

The typical knee-jerk reaction is to reinvest in a new server and replace the old one. This can be an expensive, complex and frustrating exercise if it's not completed by a dedicated healthcare IT professional.

 

In this blog we wanted to share something a little different: some strategies which will make your network more reliable, increase the lifecycle of your server and, finally, save you money.

Our hot tips are:

 

Monthly Server Maintenance
By far the easiest and most important task. The monthly server maintenance can be implemented by your IT provider (or if you are interested, email us and we will give you the steps on how to do it).

The monthly maintenance includes installing the new server operating system updates and clinical software updates, and updating your antivirus and third-party software.

 

Other tasks include checking your disk space and removing any temporary files, rebooting the server and finally, deleting any unnecessary files in the downloads or documents folder.

This activity will ensure that your server is up to date, and the reboot will ensure all the required services start correctly.
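
To make the housekeeping steps concrete, here is a minimal Python sketch of the disk-space check and temp-file cleanup described above. The drive letter, folder path and 10% warning threshold are illustrative assumptions, not recommendations, and your IT provider may prefer their own tooling.

```python
import shutil
from pathlib import Path

# Illustrative locations only - adjust to your server's layout.
DATA_DRIVE = "C:\\"
TEMP_DIR = Path(r"C:\Temp")

def check_disk_space(drive, warn_below=0.10):
    """Report free space and warn when it drops below the given fraction."""
    usage = shutil.disk_usage(drive)
    free_fraction = usage.free / usage.total
    print(f"{drive} {usage.free // 2**30} GB free ({free_fraction:.0%})")
    if free_fraction < warn_below:
        print("Warning: low disk space - clean up before the next update run.")

def clear_temp_files(folder):
    """Delete the contents of a temporary folder as part of monthly maintenance."""
    for item in folder.glob("*"):
        try:
            if item.is_dir():
                shutil.rmtree(item)
            else:
                item.unlink()
        except OSError as err:
            print(f"Skipped {item}: {err}")

if __name__ == "__main__":
    check_disk_space(DATA_DRIVE)
    clear_temp_files(TEMP_DIR)
```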

 

Upgrade Your Firmware Every 6 Months
We highly recommend you engage an IT professional for this activity. According to the world's leading technology vendors, over 90% of hardware reliability issues are due to out-of-date firmware.

 

Firmware is software that manages your server's hardware and affects the way it behaves. By upgrading the firmware of your server, you are installing the latest updates, fixes and patches which directly relate to your server.

 

Some benefits of firmware updates include a faster server, less overheating, fewer server lockups and, most importantly, a longer lifecycle.

 

Add More RAM & Hard Disk Space
When you purchased your server it would’ve had little load on it. Your staff numbers were limited and back then it didn’t have to support new updates.

 

As your clinic grows and the network requirements increase, your server will begin to feel the load. Its memory is now at full capacity and it's working as hard as possible.

 

By upgrading the RAM and hard disk space (if you are not sure how to do it, contact your IT provider or us), you are essentially giving your server more resources to handle the extra load.

This upgrade usually costs about 15% of the price of a new server, saving you money and giving your network more firepower.

 

Manage Your Backups Correctly
Running a backup is one of the most resource-intensive tasks a server can do; a backup can consume all the memory and CPU power. Our strategy is to always ensure that the backup of your server and clinical data runs outside business hours. This way you won't feel the load on the network or the server.
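
As a rough illustration of that strategy, the Python sketch below only launches a backup when the clock says the clinic is closed. The 8am-6pm weekday hours and the backup command are placeholders, so substitute whatever your backup product actually uses.

```python
import datetime
import subprocess

# Assumed business hours for this example: 08:00-18:00, Monday to Friday.
BUSINESS_START = datetime.time(8, 0)
BUSINESS_END = datetime.time(18, 0)

def outside_business_hours(now=None):
    """True on weekends or outside the assumed opening hours."""
    now = now or datetime.datetime.now()
    if now.weekday() >= 5:  # Saturday or Sunday
        return True
    return not (BUSINESS_START <= now.time() < BUSINESS_END)

def run_backup():
    # Placeholder command - replace with your real backup tool and arguments.
    subprocess.run(
        ["backup-tool", "--full", "--target", r"\\nas\clinic-backups"],
        check=True,
    )

if __name__ == "__main__":
    if outside_business_hours():
        run_backup()
    else:
        print("Inside business hours - skipping backup so the server stays responsive.")
```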

 

Implement The Right Configurations
"The right configurations" is a very open term, I know; however, configuring the server in the right way does play a big part in how it behaves when processing data and in ensuring that your clinic staff can access their medical applications.

 

A simple example would be implementing the Active Directory role (technical, I know, but this is important). If your server is set up as an Active Directory domain controller, it can manage and facilitate how users access data in a more efficient way.

 

Another recommendation would be to set up your server as the DHCP and DNS server. This way, when you access your clinical applications (Medical Director, Genie, Best Practice etc.), the network computers can quickly find the server and locate the clinical database.
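
For example, once the server is handling DNS (and DHCP points the workstations at it as their resolver), a workstation can turn the server's name into an address in one step. This short Python check uses a made-up hostname purely for illustration.

```python
import socket

# Example name only - substitute your own server's DNS name.
SERVER_NAME = "clinic-server.practice.local"

try:
    # With the server running DNS, workstations resolve its name instantly
    # instead of hunting for the clinical database by IP address.
    address = socket.gethostbyname(SERVER_NAME)
    print(f"{SERVER_NAME} -> {address}")
except socket.gaierror as err:
    print(f"DNS lookup failed - check the server's DNS role and the "
          f"workstation's resolver settings: {err}")
```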

 

Check The Firewall Settings
Most connection issues (speed, reliability) relate to three core aspects: the quality of the connection between the computer and the server, the way the server is configured and, finally, the way the firewall is configured.

 

Depending on which firewall solution you have in place, it needs to be configured correctly so that it allows undisturbed access to the clinical applications from the clinic’s computer.

 

If the firewall is not configured correctly, you will notice that the network is slow, and so are the server and the clinical applications.
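
A quick way to see whether the firewall is in the way is to test, from a clinic workstation, that the ports your applications use actually answer. The sketch below is a generic reachability check with made-up addresses and ports; your clinical software vendor will list the real ones.

```python
import socket
import time

# Example values only - use your server's address and your applications' ports.
SERVER = "192.168.1.10"
PORTS = {"clinical database": 1433, "file shares": 445, "remote desktop": 3389}

def check_port(host, port, timeout=3.0):
    """Report whether a TCP port answers, and how long the connection took."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            elapsed_ms = (time.perf_counter() - start) * 1000
            print(f"{host}:{port} reachable in {elapsed_ms:.0f} ms")
    except OSError as err:
        print(f"{host}:{port} blocked or unreachable ({err}) - "
              f"review the firewall rule for this port")

if __name__ == "__main__":
    for name, port in PORTS.items():
        print(f"Checking {name}...")
        check_port(SERVER, port)
```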



Cloud Backup Solutions 101- A Primer for Healthcare Organisations


Some businesses rely on onsite backups, whether in the form of external hard drives or perhaps tape or storage media such as DVDs or DVD-RAM, all of which are subject to failure. Hard drives typically have a life span of three to five years and even high-grade disc-based media is easily damaged by careless handling or incorrect storage (near a magnetic source, for example).

Legislation and E-health Driving Change

With the introduction of electronic medical records and legislation on data privacy, businesses are legally obligated to secure their clients' billing, medical and personally identifiable information (PII). Many companies have a disaster recovery plan that includes an offsite data backup solution. For convenience, this primarily takes place in the cloud, as the process of storing onsite backups in a fireproof safe or manually transporting backups to another location is widely considered obsolete.

Business Continuity?

Whether your business network is on-premise only, already in the cloud or a mix of the two (typically known as hybrid IT), business continuity is the aim and most organisations seek to include a solution that allows staff to continue working, even if the power or broadband service is down. When your business processes are in the cloud, restoring from backups is easy and business continuity is assured. Likely, your clients will not even know that there is a problem with your on-premise network as normal service is uninterrupted. Cloud service providers have several redundancy options in place so cloud services are rarely impacted by hardware failure.

Moving to the Cloud

If your business does not have an automated backup solution in place, it is certainly worth considering, as onsite hardware failure can jeopardise your business's reputation, even if just a few hours' data is lost. When a hard drive fails, recovery is possible but expensive and requires specialist knowledge and equipment. When data protection is the aim, an automated and real-time backup offsite is the only failsafe solution and use of the cloud ensures local disasters (whether hardware, fire or water damage, or extreme weather conditions) have no impact on your business data.

Cloud Provider Selection

All cloud providers are not created equal and like any other industry, service quality varies as does administration access. Ideally, your cloud service provider understands healthcare processes and the importance of immediate access to data in a clinical environment. Professionals in this area will offer a customised solution to fulfil all your backup and restoration requirements. This solution should include but is not limited to:

Onsite analysis of your existing broadband solution—Your broadband may well be adequate for general business use, but when backup schedules are involved (even if daily backups are scheduled outside business hours), you cannot afford to miss a backup due to a broadband outage. Possible service provider recommendations could include an additional broadband connection, a dedicated line or provision of a router that offers a 4G SIM redundancy option.

 

Backup method and process—The way you back up can determine the success of the solution. The speed of the process is determined by the speed of the broadband connection.

Data Storage—Data must be stored in a location that complies with state laws. For example, selecting a provider with U.S. servers is not compliant.

Remote access—Can the backup be performed remotely if needed? Can the resulting backup be accessed and verified remotely?

Auditing—Once a backup is performed, it needs to be verified as good. Many companies have found that unchecked backups are corrupt, failing when they are needed the most (a minimal verification sketch follows this list).

Exit Clauses—Every customer has a right to change service providers if they wish. Verify that your potential providers offer the facility to migrate your data to a new provider easily and that it is very clear who owns the data involved.
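
On the auditing point, a backup is only as good as its last verification. One simple, widely used approach (not specific to any particular provider) is to compare a backup file against a checksum recorded when it was created; the Python sketch below uses SHA-256 and a made-up filename.

```python
import hashlib
from pathlib import Path

def sha256_of(path, chunk_size=1 << 20):
    """Stream the file so large backup archives never need to fit in memory."""
    digest = hashlib.sha256()
    with Path(path).open("rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(backup_file, expected_checksum):
    """Compare a downloaded or restored backup against its recorded checksum."""
    actual = sha256_of(backup_file)
    if actual == expected_checksum:
        print(f"{backup_file}: checksum matches - backup verified.")
        return True
    print(f"{backup_file}: checksum mismatch - treat this backup as corrupt.")
    return False

# Example usage with made-up values:
# verify_backup("clinic-db-2015-03-01.bak", "3a7bd3e2360a3d29eea436fcfb7e44c7...")
```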

 

Disaster Recovery Plan

Auditing and indeed backups themselves are a key part of any disaster recovery plan. To ensure business continuity and comply with governing regulations and industry standards, healthcare organisations are responsible for the storage, backup and security of their data.

 

Fortunately, cloud service providers are held to a higher standard than typical businesses and their infrastructure must incorporate redundancy options, security and backup processes that are very costly for smaller companies to implement.

 

In conclusion, from a cost perspective, it makes sense for healthcare organisations to use the cloud for backup, storage and security. In doing so, business owners can relax, secure in the knowledge that real-time automated backups of all data are carried out in a secure manner. All that is really needed to ensure business continuity in a cloud environment is remote access using an internet-enabled device, and ensuring the internet is available is easily achieved by adding an on-premise router to the network with a redundant connection to a 4G mobile network. If you haven't already automated data backups in the cloud, can your business afford not to?



Microsoft offers Bing Rewards users 100GB of free OneDrive cloud storage


The entire purpose of Bing Rewards is to—surprise!—reward you for using Bing. Usually, that entails earning points by conducting web searches and cashing in those points for gift cards and other goodies. But you don’t have to conduct a single Bing search to claim the latest juicy offering: Through February 28, any Bing Rewards member can claim 100GB of OneDrive cloud storage for absolutely free for two years—no searches or rewards points required.

The offer won’t entice Office 365 subscribers, who already enjoy an endless bounty of storage space in the sky, but this is a don’t-miss offer for any other OneDrive user. OneDrive's free tier offers only 15GB of data, and 100GB plans typically cost $2 per month.


There are surprisingly few strings: You’ll obviously need a Microsoft account to claim the 100GB, but if you’re a Bing Rewards member you already have one. Accepting the offer gives OneDrive the right to send you promotional emails, though you may cancel them at any time. And as mentioned, the freebie storage disappears two years to the day after you claim it. Microsoft obviously hopes you’ll fill up the space and pay to keep using the extra gigs after the deadline, but be sure to read up on what happens when your cloud storage dissipates just in case.

Head over to your Bing Rewards dashboard and look for the “Free storage” header next to a OneDrive logo to get in on the action.

The story behind the story: As Microsoft shifts to embrace services more and more, expect to see more hooks like this—the use of one service to entice you into using other Microsoft products and the greater Microsoft ecosystem. Now that Windows 8.1 with Bing (with Bing as the default search provider) is rolling out in low-cost devices that are often constrained storage-wise, using Bing Rewards to dangle gobs of OneDrive storage is a natural promotion. And once you’re signed up with Bing Rewards, you’re more likely to start using Microsoft’s search engine to claim more freebies, of course.

But who cares about all that? The important part is the free 100GB of cloud storage. Go get some!



What Is "the Cloud" — and Where Is It?

What Is "the Cloud" — and Where Is It? | IT Support and Hardware for Clinics | Scoop.it
There's at least one funny joke in Sex Tape. While frantically trying to cut off access to the amateur porn vid he accidentally uploaded to iCloud, Jason Segel tries to explain why deleting the file won't work. "Nobody understands the cloud," he says. "It's a fucking mystery!" He's kind of right.

"Cloud" is a buzzword that vaguely suggests the promise and convenience of being able to access files from anywhere. But the reality is that the cloud is hardly floating like mist above our heads — it's a physical infrastructure, its many computers housed in massive warehouses all over the world. And yet as long as it's easy read email on our phones and watch movies on our laptops, we generally don't take the time to wonder where our data actually goes, how it gets there, and what happens to on its way.

What is actually happening when you punt your files, photos, and videos up to servers owned by Apple, Google, and Amazon? Let's peek behind the cloud, and face reality.
Origins of Cloud Computing

While the term "cloud computing" has only entered the public's lexicon in the past 10 years or so, the idea's been around for decades. Cloud computing basically refers to a process of sharing resources to optimize performance. Practically speaking, that means using a network of computers to store and process information, rather than a single machine.

The early days of computing actually leaned heavily on a pretty similar concept. Back in the 1950s, when computer mainframes were the size of a room, users would log on to a dumb terminal to take advantage of the machine's processing power. (They're called dumb terminals because they can't really do much of anything without the mainframe.) This time-sharing model is pretty analogous to the way cloud computing works on the internet today. But instead of one massive mainframe in the middle of a room, we rely on a global infrastructure of servers and data centers to do the heavy lifting.

By the time the 90s rolled around, it was pretty clear to the cyber-prophets of days gone by that the future would enable the whole world to share resources. Engineers started using a drawing of a cloud to refer to this network in patent drawings in the mid-90s. Compaq engineers coined the term "cloud computing" in late 1996, and less than a year later, Steve Jobs described a proto-iCloud at WWDC:

It was pretty revolutionary at the time. You store your files one place and you can access them from any device. Fast forward to the iPhone era, and it's easy to forget the dark ages, when you actually had to burn CDs and tote around external hard drives. Now you start watching a movie on your laptop, switch to your tablet, and finish it on your phone without missing a scene.

Let's back up for a second, though. The idea of cloud computing is almost metaphysical. In more practical terms, however, the applications of cloud computing tend to revolve around one key feature: storage.
Life Without a Hard Drive

A wonderful thing happened about a decade ago. Thanks to a confluence of factors, lots of computers started getting persistent, high speed internet connections. Not long thereafter, mobile devices started getting the same thing. So if devices are always online, and data transfer speeds aren't abysmal, why not just store all the software and storage online?

That's essentially where we're headed with the 21st century notion of cloud computing. Cloud computing means that your laptop works less like a standalone computer and more like a dumb terminal. Ever used a Chromebook?

From a technical point of view, leaning on the larger network of computers in the cloud makes great sense. Suddenly, you don't need to worry as much about hardware specifications, like RAM or hard drive space, because the network can do the heavy lifting.

Distributing the load across lots of powerful servers means web-based applications can run more dependably and efficiently. These servers are constantly updating, and those web apps more or less always work. If one server crashes, there are others to pick up the slack. Your IT department at work probably loves this idea.

Those are the broad strokes of cloud computing. What people sometimes blindly refer to as "the cloud" is something a little bit different.
The Truth About "the Cloud"

Cloud computing is wildly popular at the enterprise level, where IT managers are focused on maintaining stable systems that are used by hundreds or thousands of employees. Most consumers encounter the cloud on an individual basis, however, with cloud storage. Where's that sex tape? It's in the cloud. But wait, what's the cloud? It is not a giant hard drive in the sky.

When you store something "in the cloud," you're actually storing it in a very physical space. That file slides across the wire and then lives on a physical server—usually more than one—in some far flung place. And depending on which cloud storage service you use, that file is now in the possession of a giant corporation to whom you probably pay a monthly fee. Anybody who's ever used Dropbox knows that this makes it incredibly convenient to access files or to share files from any computer with an internet connection.

In the past, you just bought a computer with a hard drive inside and stored your files there. Now, you pay a company like Apple or Google to store the file remotely and provide you with access when you ask for it.

If your data lives "in the cloud," it actually lives on a company's server, and you more or less pay a membership fee to work in that company's sandbox. Depending on that company's terms of service, you may or may not actually own or control that data once it lives in cloud storage. This raises a few glaring concerns in terms of security and privacy.
Storms Ahead

The Sex Tape example is a terrific analogy for how helpless you can be once you've uploaded something to the cloud—terrible movie, terrific analogy. Once your data's in the cloud, you've lost some basic control over it. If you upload a file to a cloud storage service like iCloud, Google Drive, or Amazon CloudDrive, you're actually making copies of that file. The file likely lives on several servers in case Godzilla attacks one of the data centers or something, so if you want to delete that file, you're trusting the company to delete all of the copies.

As we've seen in the past, this doesn't always happen like it's supposed to. So you're not really in control of your data if you're not in possession of it. You're just not.

Let's say the police want to have a look. Depending on its particular privacy policies, the company you picked for your cloud storage can actually hand over your data whenever the authorities ask them. Sometimes, the cops don't even need a warrant. Companies like Google publish transparency reports on a regular basis that show how many hundreds of times this happens every year.

So just keep that in mind next time you're uploading something to Google Drive instead of storing it locally. The cops would need a warrant to break down your door and go searching through your personal hard drive. The process of getting information from Google is somewhat more streamlined.

Once you're at the stage where you're uploading files to Apple's servers, you've already agreed to the company's terms of service. (By the way, those terms of service probably failed to clarify who actually owns the data in the cloud.) The shitty part about this concern is that you can't do much about it, except trust the company storing your data and hope nothing bad happens.

Granted, tracking down deleted files and worrying about warrantless police searches don't necessarily affect the average person on a daily basis. However, the possibility that a hacker could get ahold of sensitive information should concern everyone. Look no further than the catastrophic iCloud hack to understand how this is a very real concern.

What you can do is encrypt data before you upload it to the cloud. Here's how.
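
The linked how-to isn't reproduced here, but as one minimal illustration, the Python sketch below uses the third-party cryptography package's Fernet recipe to encrypt a file before it ever leaves your machine. The filenames are examples only, and the key must be stored somewhere the cloud provider never sees.

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Generate a key once and keep it yourself - never upload it with the data.
key = Fernet.generate_key()
fernet = Fernet(key)

with open("tax-return.pdf", "rb") as source:        # example filename only
    ciphertext = fernet.encrypt(source.read())

with open("tax-return.pdf.enc", "wb") as target:
    target.write(ciphertext)                         # upload this file, not the original

# Later, on any machine that has the key:
# plaintext = Fernet(key).decrypt(ciphertext)
```
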
What's Next

The cloud is convenient. That fantasy that Steve Jobs described in 1997 is now a reality for a lot of people, and that's awesome. The cloud is so awesome that the world's biggest technology companies are scrambling to find out how to make the most money they can off of it.

For now, the monthly fees you pay for cloud storage are comparable to what you'd pay for an external hard drive back in the day. The advantage is that you can access the data from anywhere and never have to worry about the data disappearing—probably. The disadvantage is that you don't have as much control over the data and never really know what's being done with it, and could be hard-pressed to make it disappear if you want it to go away.

Google was already talking about how to put advertising on the cloud nearly a decade ago. The dystopian future in which you'd have to watch pre-roll ads just to update your resume is not as dystopian as you might think.

Cloud storage is just one aspect of cloud computing, though. While the promise of this very 21st century technology is exciting, the reality of living in a world where we all carry around dumb terminals and depend on a for-profit entity to manage our data is sobering. This doesn't mean you shouldn't use iCloud or Google Drive or Dropbox or OneDrive or CloudDrive. It just means you should know what you're really doing when you're using them.

The cloud isn't magic. It's a business.

Three Tips For Password Security That Actually Work - HITECH Answers


Someone once told me that developing a usable and secure password management system isn’t rocket science…it’s much more difficult than that. Naturally, I disagree as I have witnessed numerous implementations of password management solutions that were a major success in a very short period of time. Plus, “success” of these implementations can be measured financially, through improved operations and through improved security.

An organizational password management implementation involves a number of key elements consisting of a blend of technology and internal business processes including:

  • the use and misuse of multiple passwords
  • composing hard-to-guess passwords
  • changing and reusing passwords
  • the art and science of keeping passwords secret
  • intruder detection and lockout
  • encrypting passwords in storage and transit
  • synchronizing passwords and the latest in single sign-on
  • user authentication for self-service capabilities
  • IT support for forgotten and locked out passwords.

However, introducing password management best practices is not a daunting task, and I am certain almost every organization has the main concepts already defined (although possibly not matured). Here are three tips to help in your management.

Tip #1: Multiple Passwords Can Be Inhumane

The problem with passwords in a large enterprise is that people generally require so many different accounts and corresponding passwords to access the expansive list of both cloud and on-premise systems and applications, that sometimes it feels humanly impossible to remember them all. And just about the time you feel you have them all memorized, they then need to be changed. So what is the natural reaction of a worker who needs to efficiently accomplish all their tasks across a number of different systems? They start to develop a host of insecure behaviors around password management including:

    • writing passwords down and supporting 3M PostIt Notes sales
    • using passwords that are simple and easily compromised
    • contacting the Help Desk constantly when they forget their password (contributing to 30 percent of all Help Desk calls)
    • reusing old passwords as often as possible

These behaviors creep into the workplace because workers want to avoid downtime and the hassles that go along with it.  The solution to the entire password management problem incorporates three critical components: an easy self-service password reset capability to ensure people can reset their own passwords, a synchronization solution that changes passwords across all of a user’s systems and a single sign-on solution to limit the number of sign-ons required.

Tip #2: Compose Passwords That Are Difficult To Crack

All it takes to understand the glaring issue of password strength is to see the 25 worst passwords and their current ranking based on use (thanks to SplashData, who measures them):

1. 123456 (up 1, taking the top spot from “password” for the first time)
2. password (down 1)
3. 12345678 (unchanged)
4. qwerty (up 1)
5. abc123 (down 1)
6. 123456789 (new)
7. 111111 (up 2)
8. 1234567 (up 5)
9. iloveyou (up 2)
10. adobe123 (new)
11. 123123 (up 5)
12. Admin (new…you know who you are…)
13. 1234567890 (new)
14. letmein (down 7)
15. photoshop (new)
16. 1234 (new)
17. monkey (down 11)
18. shadow (unchanged)
19. sunshine (unchanged)
20. 12345 (new)
21. password1 (up 4)
22. princess (new)
23. azerty (new)
24. trustno1 (down 12)
25. 000000 (new)

But hey, at least “password” is no longer #1!  The solution to this overly simple problem:  prevent your users from being able to use simple, easy-to-guess passwords!  Controls around password strength have been around for a long time, and most software and operating systems provide a way to prevent weak passwords from being used if configured correctly.  Unfortunately, some organizational legacy system baggage prevents setting stringent controls holistically at the target system, so software solutions have been created to help enforce password policies and prevent poor password decisions at the time the password is set and then synchronized across systems.
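
As a sketch of what such a control might look like (purely illustrative, not any particular vendor's policy engine), the Python snippet below rejects anything on the list above and anything that is short or built from too few character classes.

```python
import re

# The 25 worst passwords from the list above, lower-cased for comparison.
BLACKLIST = {
    "123456", "password", "12345678", "qwerty", "abc123", "123456789",
    "111111", "1234567", "iloveyou", "adobe123", "123123", "admin",
    "1234567890", "letmein", "photoshop", "1234", "monkey", "shadow",
    "sunshine", "12345", "password1", "princess", "azerty", "trustno1",
    "000000",
}

def is_acceptable(password, min_length=12):
    """Reject blacklisted or structurally weak passwords before they are set."""
    if password.lower() in BLACKLIST:
        return False
    if len(password) < min_length:
        return False
    # Require at least three of: lowercase, uppercase, digits, symbols.
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return sum(bool(re.search(c, password)) for c in classes) >= 3

print(is_acceptable("adobe123"))          # False - on the blacklist
print(is_acceptable("Summer15"))          # False - too short
print(is_acceptable("Clinic-Router!42"))  # True
```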

Tip #3: Change every password but the kitchen sync.

Password synchronization can solve so many issues around password management, so I am amazed when organizations choose a password management solution that only changes the core Active Directory or LDAP password without being able to sync to all the other systems a worker uses on a regular basis. Syncing passwords ensures users only need to remember one core password when logging into corporate systems, and this ultimately helps prevent the problem of workers writing down their passwords. It also helps solve the password expiration problem since the passwords will all be changed at the same time.

The latest solutions can map usernames across systems and still sync passwords successfully. For instance, my AD account may be RYANW, but my AIX Unix password is WARDR. The password management solution keeps track of those mappings and automatically knows to change my password for both AD\RYANW and AIX\WARDR. Synchronization can now also work with cloud-based applications such as Salesforce.com, Google or Office365, so security is strengthened by regularly changing cloud-based applications that in the past were typically left unchanged or had longer expiration windows.
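
Conceptually, the mapping described above can be pictured as a small table of per-system usernames that the synchronization engine walks through whenever the core password changes. The sketch below is only a model of that idea, with a stand-in connector function rather than any real product's API.

```python
# Hypothetical account mappings for one worker across systems.
ACCOUNT_MAP = {
    "active_directory": "RYANW",
    "aix_unix": "WARDR",
    "salesforce": "ryan.ward@example.com",
}

def sync_password(new_password, set_password):
    """Push the same new password to every mapped account.

    `set_password(system, username, password)` stands in for whatever
    per-system connector your password-management product provides.
    """
    for system, username in ACCOUNT_MAP.items():
        set_password(system, username, new_password)
        print(f"Password updated for {system}\\{username}")

# Example run with a do-nothing connector:
sync_password("Clinic-Router!42", lambda system, user, pw: None)
```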

Hopefully, you will find these tips easy to implement. In my experience both in-house and as a member of an IT Consulting firm, these simple additions, if you are not already employing them, will go a long way in keeping your passwords secure and your chances of a breach considerably lower.




What you need to know about the cloud


The cloud is the Internet, when used for chores that are traditionally handled by local hardware and software. For instance, if you back up your files to an external hard drive, that’s local. But if you use an online service such as Mozy or Carbonite, you’re using cloud-based backup.

Another example: If you use the installed Outlook program to read email, you’re using the Internet, but not the cloud. But if you read your email on the Outlook.com webpage, you’re reading it in the cloud.


Other cloud-based applications include storage/sync services such as Dropbox, and web-based office alternatives like Google Docs.

The word cloud suggests something that’s not quite real, not quite solid. And while cloud computing can feel that way, that’s not quite true. The software and data are stored on a server somewhere—probably multiple servers in various places.

But the very nature of the cloud brings problems. Speed, for instance. Almost any sort of local data connection—ethernet, Wi-Fi, USB 2.0—is going to be faster than most home Internet connections, especially when you’re uploading (the exception, of course, is if you’re lucky enough to have fiber to the home). A backup that would take minutes to an external drive could take hours online.
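
To put rough numbers on that, here is a back-of-the-envelope Python calculation of how long an upload takes at a given connection speed. The 200 GB backup size, 5 Mbps uplink and 80% efficiency factor are assumptions chosen only to illustrate the gap.

```python
def upload_hours(data_gb, uplink_mbps, efficiency=0.8):
    """Rough upload time: gigabytes converted to bits, divided by usable uplink speed."""
    bits = data_gb * 8 * 1000**3                 # decimal gigabytes, as ISPs quote them
    usable_bits_per_second = uplink_mbps * 1000**2 * efficiency
    return bits / usable_bits_per_second / 3600

# Example figures only: a 200 GB backup over different uplinks.
print(f"{upload_hours(200, 5):.0f} hours on a 5 Mbps uplink")      # about 111 hours
print(f"{upload_hours(200, 100):.1f} hours on a 100 Mbps uplink")  # about 5.6 hours
```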

And then there’s price. Many cloud services are free, but only if you accept severe limitations. For instance, the free version of MozyHome will back up only 2GB of data. Upgrading to 125GB will cost you $120 a year. Compare that to the cost of an external drive: $50-$80 for 1TB, and you only have to pay that once.

Worst of all, you have to worry about security and privacy. Unless you’re encrypting everything on your end before it uploads, you have to assume the possibility that the company providing the service, the government, and illegal hackers could be accessing your files.




Cloud Computing Supports Telemedicine Growth


Today’s healthcare professionals enjoy convenient access to a multitude of tools that would have amazed previous generations. Unfortunately, lack of awareness or access to technical experts means that many practices are unable or unwilling to take advantage of the latest technological advances, advances that increase efficiency, security and productivity. Others are intimidated by the technical jargon often associated with eHealth. All that is needed to eliminate these issues is a partnership with a provider that specialises in the health industry, rather than dealing with IT companies that are unwilling to recommend healthcare-specific solutions they are unfamiliar with.

Providers of healthcare solutions are familiar with the inner workings of practices and clinics and can easily review existing processes and recommend solutions that will integrate technology in the best possible manner. They will also support any new technological solutions, leaving medical professionals more time for patient care, which will ultimately provide substantial benefits that aid early diagnosis and prolong lives.

 

Providers without healthcare knowledge will recommend solutions that they are familiar with, ones that are normally selected by traditional commercial enterprises. Such solutions are generally unsuitable for healthcare clinics and practices and often require expensive customisation, assuming that they can even be customised sufficiently to meet existing regulations.
Smaller clinics and practices do not have an on-site IT team and often eliminate IT requirements by automating server maintenance, data backups and archiving using a cloud solution. In such a scenario, it is the provider that is responsible for all of these activities.

 

Telemedicine allows easier collaboration and involves the use of mobile or other internet-enabled devices. Advantages include instant videoconferencing, remote consults, immediate access to electronic health records and the elimination of geographical issues, where patients are unable to visit the practice or clinic in person. These collaborative features are used between medical teams in multiple locations, between mobile clinics and their headquarters and of course can be used by any medical professional on the move.

 

When cloud services are used, connectivity is possible from anywhere a broadband signal is present, whether to a PC or portable device. This is ideal for patients in remote or rural areas and eliminates the time and expense necessary to consult with a specialist in the traditional manner. With videoconferencing, for example, no travel is required yet an excellent service is provided by the healthcare professional involved. Even follow-ups are possible online. Security concerns are also eliminated as a travelling professional accesses data remotely and never stores it on their own personal devices.

 

There are additional cloud benefits for healthcare professionals and these include:
• Scalability – you pay for the amount of space you use and it can be increased on demand
• Automatic updates – the provider’s IT team install security patches promptly
• Disaster recovery – automated regular backups take place and are restored when active data is lost
• Redundancy – multiple broadband connections are available. When one fails, another takes over
• Flexibility – if additional bandwidth is needed, it is readily available. This is not possible with traditional networks
• Works from anywhere – an internet-ready device, a 3G or broadband connection and you are good to go
• Collaboration – depending on requirements, there are specific software applications available to aid collaboration between team members and patients
• Document management – a single document repository allows secure and controlled access to confidential information
• Security – the use of the cloud ensures data is never stored in portable devices, given that thousands are lost or stolen every year
• Green-tech – the carbon footprint for each practice or clinic is substantially reduced, with cloud servers using less power per client due to virtualisation technology
• Cost savings – every clinic and practice uses the latest in hardware and software technology but without the initial investment. Ongoing IT maintenance costs are also dramatically reduced

When it comes to telemedicine, in addition to data management and document control features, there are software applications available that maximise patient turnover per clinic, improve patient care and even improve follow-up treatment and remote monitoring processes.

 

With benefits of this magnitude and with evolving regulations to embrace technological advances in eHealth, clinics need to install a telemedicine solution sooner rather than later or give competitive advantage to those that adopt now. This is especially true if patients and colleagues are in other geographical areas. In fact, government services are already active for eligible aged-care homes and to patients of Aboriginal Medical Services throughout Australia.



Windows 10: No More Monthly Patches


For its soon-to-be-released Windows 10 operating system, Microsoft will abandon its longtime practice of issuing a batch of "Patch Tuesday" product and security updates once per month. Instead, the company will begin offering 24/7, cloud-based patching, which will become the new default for consumers. For the enterprise market, a new Windows Update for Business will enable IT managers to take advantage of these anytime updates or define their own patch-release schedules.


Those are just some of the new Windows 10 features announced this week at Microsoft's Ignite conference in Chicago. Windows 10 could ship as early as summer 2015 for PCs, the company says, but the OS will launch later for smartphones, tablets, the Xbox and other devices. The operating system is the successor to Windows 8 - Microsoft skipped "Windows 9" - which was released in late 2012.


"Windows 10 follows the path first taken by the smartphone sector where iPhones, versions of Android and Windows Phones pioneered getting updates delivered to users as soon as they become available," says Wolfgang Kandek, CTO of security firm Qualys. "This strategy has worked out exceptionally well when it comes to security." Indeed, Verizon's 2015 Data Breach Investigations Report found that a scant 0.03 percent of smartphones get infected with "higher-grade" malicious code, which is orders of magnitude below PC infection rates.


But some notable Windows 10 security questions remain unanswered. Microsoft has yet to reveal if its cloud-based approach to updating devices will work with just Windows 10, or also with Windows 7 and Windows 8. It's also unclear whether Windows Update for Business will replace the widely used Windows Server Update Services.

Windows 10 Security Overview

Ahead of the new operating system's debut, Terry Myerson, executive vice president of Microsoft's operating systems group, took to the stage in Chicago to describe four key information security areas that are being addressed in Windows 10:

  • Device protection: Hardware-based Secure Boot can restrict the types of software that load when the device is powered on. A new Device Guard can be set to only allow a "white list" of approved applications to run, backed by Hyper-V, a native hypervisor that creates virtual machines. And Microsoft is touting a "new device health capability" that ensures endpoints are free from malware and bugs, and fully updated, before they're allowed to connect to enterprise resources.
  • Identity protection: Microsoft says the Windows 10 Passport - which also uses Hyper-V - can protect credentials and handle secure authentication with networks and websites without sending passwords, thus providing a defense against phishing attacks. The new Windows Hello feature, meanwhile, allows for biometric access controls via faces or fingerprints.
  • Application protection: Microsoft will certify the security of applications purchased via its Windows Store for Business. Businesses can also set Device Guard to only allow those certified applications to run on a device. All applications will also be restricted to only using kernel-level drivers that are digitally signed by Microsoft. "Windows 10 will not allow older drivers to run unless fully compatible with Windows 10," says Sean Sullivan, security adviser at anti-virus vendor F-Secure. "Microsoft expects developers to tighten up their old code ... which is better for both security and the user experience."
  • Information protection: Enterprise Data Protection can be set to automatically encrypt all corporate data, including files, emails and website content, as it arrives on the device from online or corporate networks.
Security-Only Patching

With the introduction of Windows 10, Microsoft is also planning big changes to how Windows devices can be updated.

One notable change centers on updates for mission-critical systems - such as medical equipment or the supervisory control and data acquisition systems that power factories and refineries - that must never be allowed to crash, and for which IT managers thus often never install any Windows updates. As a result, such devices are often at risk from exploits that target known vulnerabilities.


With Windows 10, however, Microsoft will now issue "Long Term Servicing Branches" that will "contain only security updates, without any functional updates," Microsoft's Myerson says. That way, businesses should be able to keep these mission-critical systems patched against attacks that target known flaws, without worrying that various feature changes or upgrades will crash the system.

Windows Update for Business

With Windows 10, businesses will also have new types of patch-distribution capabilities, via Windows Update for Business, which Myerson says will be a free service for business-focused Windows Pro and Windows Enterprise devices. Windows Update for Business will offer four options that are designed to make updates easier and less expensive to manage, while also enabling IT managers to get security and functionality updates into users' hands more quickly:

  • Distribution waves: IT managers can specify update waves, so critical devices get untested patches first. Others could be set to still receive monthly patch updates. F-Secure's Sullivan says that this "looks like good stuff," because it will allow businesses to reduce the time they need to patch enterprise systems.
  • Maintenance windows: Patch managers can specify when updates should - or should not - occur (the sketch after this list shows how waves and windows can combine).
  • Peer-to-peer delivery: P2P can be used to get updates to remote offices or workers. "The peer-to-peer distribution model for these updates will help with connectivity bottlenecks," Kandek says. "It's an attestation to the power of this networking technology which has been well tested in gaming and video distribution."
  • Integration: Microsoft says the new patching capabilities will work with existing systems management tools that handle patching, such as System Center and the Enterprise Mobility Suite.
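
To see how waves and maintenance windows combine, here is a toy Python model of the policy logic. It is purely illustrative and not how Windows Update for Business is actually configured, which happens through Microsoft's own management tooling.

```python
import datetime

# Illustrative policy only - real configuration lives in Microsoft's tooling.
DEVICE_POLICY = {
    "reception-pc-01": {"wave": 1, "window": (datetime.time(19, 0), datetime.time(23, 0))},
    "pathology-server": {"wave": 3, "window": (datetime.time(2, 0), datetime.time(4, 0))},
}

def may_update(device, current_wave, now):
    """A device updates only when its wave is reached and the clock is inside its window."""
    policy = DEVICE_POLICY[device]
    start, end = policy["window"]
    return policy["wave"] <= current_wave and start <= now.time() < end

now = datetime.datetime(2015, 6, 1, 20, 30)
print(may_update("reception-pc-01", current_wave=2, now=now))   # True
print(may_update("pathology-server", current_wave=2, now=now))  # False - later wave, outside window
```
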
Goodbye, Patch Tuesday

Windows 10 marks a big change to Microsoft's policy of releasing patches in monthly batches, which dates back to 2003. The rise of agile programming has changed businesses' and consumers' expectations about how - and how quickly - their software should receive updates.


Some vendors now patch and release fixes for flaws in a matter of days, or less. At the annual Pwn2Own hacking contest, for example, after security researchers demonstrate new flaws in widely used software products, Google and Mozilla regularly issue patches for those vulnerabilities in their Chrome and Firefox browsers in less than 24 hours.


Recent versions of those browsers have been built using agile development techniques - including rapid development "sprints" - that might see new versions of an application get released at least every few weeks. Coupled with those browsers having the ability to automatically receive and install updates, these more frequent releases allow developers to patch products more frequently, and that's led some companies, including Google, to adopt more rapid patching as the norm.


With Windows 10, Microsoft is positioning itself to embrace these techniques too, in part via its new "Microsoft Edge" browser, known previously by its "Project Spartan" code name.


"For enterprises, IT teams there do have the option to continue with tighter patch control and testing," Kandek says. "However, I don't doubt that most IT teams will see the advantages of shifting over to the new model, as it supports fast patching on the desktop level. More and more, our desktop PCs and laptops have become pure Internet-connected workstations that will have no dependencies on legacy applications that force the use of outdated software versions, so the old model for patching becomes less relevant over time."



Why Mobile Cloud Will Become the Great Disruptor


One by one, more industries are being disrupted by the strategic use of mobile and cloud technology plays, as savvy new competitors seek ways to shift consumer preference to favor their online digital offering.

The emerging Mobile Cloud phenomenon will eventually disrupt the enterprise IT arena. Meanwhile, it will certainly continue to change the dynamics of the video entertainment sector.

According to the latest worldwide market study by The NPD Group, mobile gamers -- those who play on a smartphone, iPod touch, or tablet -- are playing more often, and for longer periods of time, than they were two years ago.

The study uncovered that the average time spent playing in a typical day has increased 57 percent to over two hours per day in 2014 versus one hour and 20 minutes in 2012.

The growth of the media tablet market has seen these devices become central to the mobile gaming story. New and improved devices enable the transformation of creative online gaming experiences.

Not only are they the devices that are being played on the most, but tablet gamers are also more likely to pay for games and to spend more money on average than gamers on other mobile platforms.

"Continued mobile growth will stem from existing customers paying more to play, especially in the free-to-play portion of the market," said Liam Callahan, industry analyst at www.npd.com/">https://www.npd.com/" target="_blank">NPD Group.

The average number of playing sessions are at their highest from ages 6 to 44. However, the average number of minutes per session peaks in the tween years, then falls through the teenage and early adult years.

As an example, children ages 2 to 12 are spending the greatest proportion of their device time on gaming versus other activities. This age group is also playing more games (average of 5 games), as well as more games that were paid for (average of 3 games).

The average amount of money spent by this age group over the past 30 days on new games, and in-game purchases is also one of the highest, second only to mobile gamers in the 25 to 44 age group.

The majority of mobile gamers are also playing video games on other platforms or devices, with only one-in-five players being mobile-only gamers. That said, regardless of the number of devices used to play games, mobile devices have the greatest amount of play time.



Looks like Microsoft is outsourcing a part of its cloud


CDN expert Dan Rayburn reported Monday that Microsoft is deep-sixing its home-grown content delivery network capabilities and is instead turning to Verizon EdgeCast to deliver video for Microsoft Azure customers. Verizon bought EdgeCast for its media delivery expertise in late 2013.

Reached for comment, a Microsoft spokesman provided a limited confirmation: “Microsoft licenses technology from many partners to complement our product offerings and to give customers complete solutions. We are happy to partner with EdgeCast to provide an integral component of the Azure Media Services workflow.”

Delivery of content of all kinds, including bandwidth-hungry video, has been a priority for the Azure forces. Microsoft trumpeted the use of Microsoft Azure Media Services to help live stream Winter Olympic events from Sochi, for example, but that effort relied on CDN market leader Akamai.

“While Azure did have some CDN services of its own before shutting them down, they were basic,” Rayburn said via email.

“Partnering with Verizon’s EdgeCast gives Azure more CDN functionality, greater reach and capacity and allows Azure to get all of the advantages of one of the best CDNs in the market, without any of the major capex or opex challenges. It’s a smart move on Azure’s part,” he said.

As Rayburn pointed out on his blog, Amazon builds almost everything in its cloud from foundation to rooftop. Microsoft, on the other hand, is more partner-focused and thus more inclined to license or buy technology.

And, don’t forget, Microsoft is also playing cloud catch-up to Amazon Web Services, which, having launched in 2006, has a multi-year head start over competitors. Azure’s first PaaS-based incarnation launched in 2010, but the more AWS-comparable version kicked off in 2013.

When you’re behind in the race, buying in could be a way to make up for lost time.


Top Security Threats Still Plaguing Enterprise Cloud Adoption - Redmondmag.com


As cloud computing moves beyond the early-adopter stage, security and privacy concerns and the inherent risk of moving assets off-site are not just fears -- they're real. Uncertainty about data security and privacy was slowing the adoption of cloud computing before last year's revelations by Edward Snowden of covert government surveillance, but the scope of that surveillance accentuated skepticism, coinciding with the rise of cyber attacks from around the world.

"Edward Snowden's revelations were really a wake-up call for the industry about what the government can do with your data," says IDC analyst Al Hilwa. "And if the government can see your data, who else can? It's really not surprising that security concerns have slowed enterprise adoption."

Those fears notwithstanding, they're unlikely to put a major dent in projected adoption of public cloud services in the coming years. Gartner Inc., for example, predicts cloud computing will constitute the bulk of new IT spending by 2016, and that nearly half of large enterprises will have hybrid cloud deployments by 2017. However, the results of a recent survey by U.K.-based communications services provider BT Group of IT decision makers in large U.S. companies underscore a contradiction: 79 percent of respondents said they're adopting cloud storage and Web applications in their businesses, but they also report their confidence in the security of the cloud is at an all-time low.

Top Security Threats
The lack of confidence is with good cause. The Cloud Security Alliance (CSA) has identified what its researchers believe to be the top nine cloud security threats. Data breaches top that list, dubbed "The Notorious Nine". Also on that list are data loss, service traffic hijacking, insecure interfaces and APIs, denial-of-service attacks, malicious insiders, cloud services abuse, insufficient due diligence, and shared technology vulnerabilities. The company emphasized those risks at a three-day conference in September hosted jointly by the CSA and the International Association of Privacy Professionals (IAPP).

Not on that list, but another major risk, is the ease with which employees can and typically do bypass IT departments when using cloud services, says Jim Reavis, founder and CEO of the CSA. Today, anyone can use a credit card to spin up a virtual machine on Amazon or Microsoft Azure, set up a SharePoint instance via Office 365 or another third-party provider or by using free services such as Box, Dropbox, Google Drive or Microsoft OneDrive. Reavis points out that when people bypass IT when using these and other services, it undermines business-level security policies, processes, and best practices, making enterprises vulnerable to security breaches.

Another risk Reavis points to: the lack of knowledge by IT management of the scope of cloud usage in an organization. At the CSA Congress 2014, the group published the results of a survey of U.S. companies, many of which drastically underestimated the number of cloud-based apps running in their organizations. The report concludes, "Cloud application discovery tools and analytical tools on cloud app policy use and restrictions are crucial in the workplace, especially when it comes to sensitive data being used by these cloud applications. With sensitive data being uploaded and shared by these apps with authorized and unauthorized users, policy enforcement becomes a major role in protecting your data."

The report estimated that, with more than 8 billion Internet-connected devices, a growing number of businesses may own data but no longer own their infrastructure. "A few years from now, that 8 billion will become a quarter trillion," Reavis says. "If we lose ground on privacy and security today, we'll have a very hard time getting it back. That creates a mandate to embrace the tools and technologies that are emerging to manage and protect these resources."

The proliferation of all those devices and the bring-your-own-device corporate culture has resulted in an enterprise that's more difficult than ever to protect -- cloud or no cloud, says C.J. Radford, VP of Cloud at data security company Vormetric Inc.

"The perimeter has failed or is failing, given that data is now everywhere," Radford says. "If you're only focused on your perimeter, you're going to have a very hard time protecting your data. But that's where the enterprise has traditionally spent its money over the past 10 or 15 years -- essentially, on building a bigger moat. The problem is, you can't build a moat around, well, everything."

Controlling Access
In an increasingly cloud-centric, perimeter-less world, enterprises must concentrate their security efforts on protecting the data itself, Radford says. His company partners with leading cloud vendors, including Amazon Web Services Inc., Rackspace, IBM Corp., and Microsoft, to provide data-at-rest encryption, integrated key management, privileged user access control, and security intelligence logging. Among other things, the Vormetric Key Management Key Agent software works with Microsoft SQL Server Transparent Data Encryption (SQL Server TDE) to help manage SQL encryption.

"Today, it's all about controlling data access," he says. "If you read any of the major breach reports, one of the ways the bad guys are getting access to data is compromising privileged username and password credentials. They're doing it through social engineering, phishing and that sort of thing."

Not surprisingly, Radford is a strong advocate of data encryption, and he also recommends a bring-your-own-key (BYOK) approach. "You should never rely on the provider to manage your encryption keys," he says.

"BYOK means the provider can turn over your data in encrypted form, but it's useless without the key. The other thing it buys you is the ability to `digitally shred' your data. We call that `permanently securing your data.' That's why we always say, rule No. 1 in encryption is never lose your key."

Encryption support is even showing up above the infrastructure level. Azure, Outlook.com, Office 365 and OneDrive, for example, are now supported by Transport Layer Security (TLS), Microsoft announced last summer. The encryption support covers inbound and outbound e-mail, as well as Azure ExpressRoute, which allows users to create private connections among Azure data.

Data encryption and data-centric solutions seem to be especially appealing to enterprises in the post-Snowden era, says Luther Martin, chief security architect for Voltage Security Inc.

Martin believes the primary cloud security concern in the enterprise today is availability.

"If you look at the data, in terms of frequency, most of the cloud incidents so far have been about service outages," he says. "The outages have been relatively short, but they can be terrifying, and there's not much an enterprise can do about them."

He also notes, however, that encryption keys present their own challenge -- namely, keeping track of them. "Effective encryption key management is hard," he says, "and people often don't give it the consideration it deserves. I mean, if you lose a key, you've lost your data, too."





The Biggest Thing in Cloud Computing Has a New Competitor | WIRED


Docker is the hottest new idea in the world of cloud computing, a technology embraced by Silicon Valley’s elite engineers and backed by the industry’s biggest names, including Google, Amazon, and Microsoft. Based on technologies that have long powered Google’s own online empire, it promises to overhaul software development across the net, providing a simpler and more efficient means of building and operating websites and other massive online applications.

But some of Docker’s earliest supporters now believe that the company behind the technology, also called Docker, has strayed from its original mission, and they’re exploring a new project that aims to rebuild this kind of technology from scratch.

On Monday, a San Francisco startup called CoreOS unveiled an open source software project called Rocket, billing it as a Docker alternative that’s closer to what Docker was originally designed to be. “The original premise of Docker was that it was a tool that you would use to build a system,” says Alex Polvi, the CEO and co-founder of CoreOS, a company that has been one of Docker’s biggest supporters since the technology was first released early last year. “We think that still needs to exist…so we’re doing something about it.”

The project is still a long way from complete—CoreOS is open sourcing an early version of the technology in the hopes that others will help define and build it—but some notable names are already eyeing the technology, saying that it could help fill some holes in the cloud computing landscape. Craig Mcluckie, who helps oversee Google’s cloud computing services, calls the Rocket project “interesting,” saying that, if it continues to progress, Google will consider contributing to the project.

A Shipping Container for the Internet

You can think of Docker as a shipping container for the online universe, a tool that lets developers neatly package software and move it from machine to machine. Today, when running large online applications such as a Google or a Twitter or a Facebook, developers and businesses often spread software across dozens, hundreds, or even thousands of machines, and Docker provides a more efficient means of doing so.

It’s based on technology built into the Linux open source operating system—technology that Google has long used to more efficiently run its online operation, the largest on the net—and it seeks to provide a standard way of using this technology, something that any developer can use across all their own machines as well as atop cloud computing services from the likes of Google and Amazon, services that let them run software without setting up their own machines.

But for Polvi, Docker is no longer the simple container format it was originally designed to be. The trouble, he says, is that the software that runs Docker containers—known as the Docker Engine or Docker runtime—has evolved into something that’s far more complex than it was in the past. “If you pay attention to all the things that Docker is doing, currently, it’s really evolving into more of a platform, rather than a container used to build a platform,” he says.

Basically, Docker the company is packing the Docker runtime with all sorts of software designed to help developers run complex applications, and Polvi believes the technology should remain a simple building block for online applications. For instance, the Docker runtime now includes software for running containers across a large cluster of machines—software that behaves much like separate tools offered by Google, another San Francisco startup called Mesosphere, and other companies—and in this way, Polvi argues, it now competes with its own partners.

So, Polvi and CoreOS have built a new container format known, appropriately enough, as App Container, and they’ve created a runtime for this format called Rocket. They fashioned this software with an eye towards added security, but the main aim, Polvi says, is to return the Docker idea to its roots. “It’s time to rein things back in,” he says, “with an alternative.”

As Google indicates that it could contribute to the project in the future, Matt Trifiro, the head of business development at Mesosphere, says that the company’s engineers have already helped shape the project. Though Google’s Mcluckie says the project is still “nascent,” he, like Polvi, believes that a container technology should remain a simple, modular technology that can be used to build much larger things. Trifiro argues much the same. “You need a nice, self-contained unit that doesn’t have a lot of dependencies,” he says. “We think that [Rocket] serves a unique function in the marketplace—and we think it’s going to take off.”

‘Competitive Tension’

Cloud computing software and services from Google and Mesosphere will continue to run Docker containers, and both companies are careful to say they do not intend to split with the Docker project. You’ll hear much the same from Pivotal, another notable cloud computing company that’s eyeing the Rocket project. “Linux containers are important for the industry and we will collaborate with anyone on open standards to incorporate the lessons Pivotal has learned from years of running containers in production,” says Pivotal’s Andrew Clay Shafer.

On some level, these companies are trying to navigate what Polvi calls “competitive tension.” Though Docker the company is now offering tools that seem to compete with their own, Google and Mesosphere still believe in the basic idea of Docker—and they know it has enormous support from the wider community of internet developers.

Like Rocket, Docker is an open source project, but in some ways, it’s controlled by the company behind it. Rocket is an attempt to create a project that is “more open,” that allows for contributions from a much wider community. Polvi says, for instance, that anyone can build their own runtime that works with the App Container format.

This sort of thing often happens in the world of open source software, where outside developers and companies will grow dissatisfied with the direction of a project and start their own, hoping to better serve their own needs. CoreOS offers a new version of the Linux operating system that aims to simplify large online applications in myriad ways, and Polvi believes that the company’s customers are better served by the new project. But Polvi would be just as happy if Docker evolves into something that’s closer to Rocket.

“We intend to collaborate with Docker. We can contribute these ideas back to Docker,” he says. “It’s just that having an independent implementation will help things go faster.”


