The Big Data Difference: Predictive Analytics

What if you could know in advance which patients would benefit from certain therapies? Or identify patients approaching a medical crisis and intervene before it's too late? While doctors have traditionally had to rely on instinct to make these calls, predictive analytics could be a game changer for hospitals, healthcare providers and patients.

The Power of Prediction

Medical sensors and data analytics can be used to power medical devices that predict adverse outcomes before they occur. By analyzing very large data sets, researchers can identify subtle markers, such as small changes in vital signs or patient behaviors, that correlate with the development of serious conditions like heart failure or kidney failure. If we can learn to look for the right signs, we can develop an early warning system for imminent medical crises.

Combining data analytics with body-worn or implantable medical sensors will allow us to better monitor patient health. These sensors can pick up subtle changes in biometrics, biomarkers and other patient data over time. Using predictive analytics, smart sensors could use these readings to detect early warning signs of kidney failure, stroke, heart failure and other medical crises, alerting healthcare providers before adverse events occur. Data analytics could also be used to power smart apps or devices that provide ongoing guidance to patients in response to sensor data in order to help them better manage chronic conditions.
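
To make the idea concrete, here is a minimal sketch of what such an early-warning check might look like; the vital sign, window size and z-score threshold are illustrative assumptions, not values from any clinical study.

```python
# A minimal sketch of a sensor-based early-warning check (illustrative only).
# The threshold, window size and vital-sign values below are hypothetical,
# not clinical guidance.
from statistics import mean, stdev

def early_warning(readings, window=24, z_threshold=2.5):
    """Flag the latest reading if it drifts far from the recent baseline.

    readings: chronological list of a single vital sign (e.g. resting heart rate).
    Returns True when the newest value deviates more than z_threshold
    standard deviations from the preceding `window` readings.
    """
    if len(readings) <= window:
        return False                      # not enough history for a baseline
    baseline = readings[-window - 1:-1]   # the window just before the latest value
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return False
    z = abs(readings[-1] - mu) / sigma
    return z > z_threshold

# Example: a slow upward drift followed by a sharp jump triggers an alert.
hr = [62 + i * 0.1 for i in range(30)] + [88]
print(early_warning(hr))  # True -> notify the care team for review
```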

Continue reading

DevOps: Keep the Business in Business

The concepts of DevOps are not new: How do you best manage the technology (people, processes and the fun stuff) that enables a business or mitigates obstacles? By blending the practices from Agile, Lean and ITSM, you create a better, faster, safer way of working across the organization.

If technology sustains and improves business processes and customer products, then it stands to reason that DevOps and business continuity work hand in hand.

DevOps encourages you to think in terms of the entire value chain of events. Release management, for instance, is the chain of events that ensures what is being created is delivered for use as soon as possible, with high quality, and in a way that can be supported. Sometimes processes work together, such as budgeting (deciding how to use money and other resources) and funding (actually using money or other resources).

If you use Agile practices, then you already have the basics of release, deploy, test, change and support processes integrated. Why not add an aspect of Business Continuity Management to your functional requirements and user stories, or to your “what happens if” scenarios? In this way, from the beginning of the chain to the end, everyone involved can look for, design against, test and be ready to support those occasions.
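
As a rough illustration of folding a "what happens if" scenario into the delivery chain, here is a minimal sketch of a continuity check expressed as an automated test; the payment service and its queue-based fallback are hypothetical stand-ins, not part of any specific toolchain.

```python
# A minimal sketch of a business-continuity "what happens if" scenario written
# as an automated test. The service and its fallback behavior are hypothetical.

class PaymentService:
    """Toy service that falls back to a retry queue when its gateway is down."""
    def __init__(self, gateway_up=True):
        self.gateway_up = gateway_up
        self.queued = []

    def charge(self, order_id, amount):
        if self.gateway_up:
            return {"order": order_id, "status": "charged", "amount": amount}
        # Continuity path: accept the order and retry later instead of failing hard.
        self.queued.append((order_id, amount))
        return {"order": order_id, "status": "queued", "amount": amount}

def test_what_happens_if_gateway_is_down():
    svc = PaymentService(gateway_up=False)
    result = svc.charge("A-1001", 49.95)
    assert result["status"] == "queued"   # the business keeps taking orders
    assert svc.queued                     # and nothing is silently dropped

test_what_happens_if_gateway_is_down()
print("continuity scenario passes")
```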

View Source

Are You Focused On The Right Analytics?

The analytics continuum: Descriptive Analytics → Predictive Analytics → Prescriptive Analytics. Are you focused on the right one?

Let's assume I am a marketing analyst for one of the largest discount retailers in the market. My company has invested heavily in one of the largest analytics tools on the market (actually, I mean descriptive analytics tools), whose name ends with a U. I look at trends as soon as data is loaded into the tool. My end consumers include the VP of Marketing, the VP of Category Management and the Finance team. The Sales team also has access to my reports. I am the front line for all of these folks.

Every quarter we have a planning exercise that starts with all of the above participants getting together in a room and talking through plans for that quarter. I start by providing each team with its most up-to-date, refreshed dashboard. Based on the sales and performance seen in these reports, we determine products, sub-segments and promotions for the new quarter. The only problem is that once we have some conclusions, we then ask the predictive analytics team to run some scenarios for us. That is complicated because they have to update their data sets and then run a few different simulations around the decisions we make. Most of the time we have to change one or more variables and re-run the simulations. There are about 6-10 people involved in this exercise. Sometimes we get to a specific answer that we all agree on, but it requires a lot of interaction between the different teams.

Sometimes, because of the complexity of all these steps, we rely on our experience to come up with the plan more than any prescriptive analytics!

We claim to be data-driven, but it's a little complicated sometimes. So we "kind of" use reports, but really we are using our own business acumen to make the final call.

The problem with the above analytics continuum, for this marketing analyst, is that it is set up for failure. The core reason people spend time on analytics is not to have intimate access to the latest reports. Knowing what just happened is always important and useful, but in reality it only tells you how well things went, or didn't; it gives you an overview of past financial performance. What it often fails to do comprehensively is focus on what you should do tomorrow. A lot of companies are spending a lot of time on descriptive analytics and not on the rest of the continuum. If the core goal of looking at analytics is to plan for the future, then that is where people should be focused. If the most important process is coming up with the plan for the future, the next month, quarter or year, then the prescriptive analytics part should happen first. In fact, the entire continuum should be inverted, and perhaps collapsed into one step.
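
To illustrate what "prescriptive first" could look like in practice, here is a minimal sketch that enumerates candidate plans, simulates each one, and surfaces a recommendation before anyone opens a dashboard; the response model, discount levels and ad-spend figures are hypothetical placeholders.

```python
# A minimal sketch of putting the prescriptive step first: enumerate candidate
# plans, simulate each, and surface the best one ahead of the planning meeting.
# The demand model and the candidate values are hypothetical placeholders.
from itertools import product

def simulated_margin(discount, ad_spend, base_units=10_000, unit_margin=4.0):
    """Toy response model: discounts lift volume but erode per-unit margin."""
    lift = 1 + 2.5 * discount + 0.00005 * ad_spend
    units = base_units * lift
    return units * unit_margin * (1 - discount) - ad_spend

candidates = product([0.0, 0.05, 0.10, 0.15], [0, 10_000, 20_000])
best = max(candidates, key=lambda c: simulated_margin(*c))
print("recommended plan: discount=%.0f%%, ad spend=$%d" % (best[0] * 100, best[1]))
```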

View Source

If utilities moved to the cloud, would they use more renewables?

The cloud has been a game changer for various industries looking to embrace digitization. In the context of energy, it has the potential to significantly increase the ways in which clean and renewable energy resources can be made viable.

Yet many power utilities are lagging behind their industrial counterparts in embracing the cloud and cloud-based services. What’s holding them back?

Archaic regulatory structures and an institutional distrust of moving data over the internet are a couple of the major inhibitors. In the past, the IT needs of utilities didn't necessitate extensive infrastructure or staff, so utilities could afford to install and run everything in-house.

But industry-wide changes stemming from the global energy transition require a more robust and agile IT infrastructure — and for many utilities, this won’t be economically feasible without the cloud.

That global transition in energy can be characterized by two major trends:

  1. The growing demand for cleaner forms of energy such as renewables, distributed generation and energy efficiency; and
  2. A changing set of customer expectations, in which customers require the same digital services from their power utility as they do from their cellular provider.

The cloud will help utilities create value in terms of operationally facilitating the increased use of renewables and clean energy. It also will support a better experience for customers by helping them to meaningfully reduce their energy use. In some cases, it even can improve energy efficiency for the utility.

Continue reading

The cloud computing effect: Better security for all

Cloud computing offers lots of benefits, but improved security is not one that makes many IT lists. In fact, many — perhaps even most — IT pros still believe that cloud computing means a huge step backward in terms of security risk.

That doesn’t seem to be the case. About 10 percent of our workloads now run on public clouds, and so far, so good.

Why? Ironically, partly because IT has been so paranoid about public clouds that it spent time and money to implement advanced security approaches such as identity and access management and to be more proactive about security measures.

Moreover, public cloud providers themselves understand the importance of security. If they get one cross-tenant hack, they are done for. Thus, providers consistently and proactively update security systems. Most enterprises would like to do the same, but they don’t have the time or the budget, which leaves them comparatively more vulnerable.

Continue reading

Big data car pool a traffic jam saver

The rise of big data has given new hope that car pooling could be the solution to opening up Australia's gridlocked city roads.

From disrupters like Uber to government officials, the potential of massive, detailed trip-mapping data is generating hope that commuters making the same trip each day might share the journey rather than each making their own way in their own road-hogging car.

From Uber's new car pooling service to giving freight trucks a green-light run through cities, the effective use of data and technology could free up billions in wasted productivity, transport experts and operators believe.
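
As a rough sketch of how trip-mapping data could surface carpool candidates, the snippet below buckets commuters by approximate origin, destination and departure window; the grid size, time slot and sample trips are illustrative assumptions.

```python
# A minimal sketch of matching commuters from trip data: riders whose origins,
# destinations and departure times fall into the same bucket become carpool
# candidates. The grid size and time window are hypothetical simplifications.
from collections import defaultdict

def bucket(trip, grid=0.01, window_min=15):
    """Coarsen a trip into a hashable key (roughly 1 km cells, 15-minute slots)."""
    (olat, olon), (dlat, dlon), depart = trip["origin"], trip["dest"], trip["depart"]
    return (round(olat / grid), round(olon / grid),
            round(dlat / grid), round(dlon / grid),
            depart // window_min)

def carpool_groups(trips):
    groups = defaultdict(list)
    for t in trips:
        groups[bucket(t)].append(t["rider"])
    return [riders for riders in groups.values() if len(riders) > 1]

trips = [
    {"rider": "A", "origin": (-33.871, 151.207), "dest": (-33.917, 151.231), "depart": 8 * 60 + 5},
    {"rider": "B", "origin": (-33.872, 151.206), "dest": (-33.916, 151.232), "depart": 8 * 60 + 10},
    {"rider": "C", "origin": (-33.769, 151.100), "dest": (-33.917, 151.231), "depart": 8 * 60},
]
print(carpool_groups(trips))  # riders A and B share origin, destination and time slot
```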

Continue reading

How predictive analytics discovers a data breach before it happens

Predictive analytics is a science that is gaining momentum in virtually every industry, enabling organizations to modernize and reinvent the way they do business by looking into the future and obtaining foresight they previously lacked.

This rising trend is now finding its way into the domain of cybersecurity, helping to determine the probability of attacks against organizations and agencies and set up defenses before cybercriminals reach their perimeters. Already, several cybersecurity vendors are embracing this technology as the core of their security offering. Here’s how predictive analytics is changing the cybersecurity industry.
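
A minimal sketch of the underlying idea, assuming a handful of made-up risk signals and weights rather than any vendor's actual model: score recent account activity and surface the riskiest cases for investigation before an attack lands.

```python
# A minimal sketch of scoring how likely an account's recent activity is to
# precede a breach, so defenders can act before the perimeter is reached.
# The features and weights are hypothetical, not a vendor's actual model.
import math

WEIGHTS = {"failed_logins": 0.6, "off_hours_access": 1.1,
           "new_geolocation": 1.4, "privilege_change": 1.8}
BIAS = -4.0

def breach_probability(event):
    """Logistic score over a handful of illustrative risk signals."""
    z = BIAS + sum(WEIGHTS[k] * event.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

suspicious = {"failed_logins": 4, "off_hours_access": 1, "new_geolocation": 1}
routine = {"failed_logins": 0, "off_hours_access": 0, "new_geolocation": 0}
print(f"suspicious: {breach_probability(suspicious):.2f}")  # high -> investigate first
print(f"routine:    {breach_probability(routine):.2f}")     # low  -> normal monitoring
```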

The Difference Between Big Data and Smart Data in Healthcare

"Big data" is one of those terms that gets thrown about the healthcare industry – and plenty of other industries – without much of a consensus as to what it means. Technically, big data is any pool of information that is compiled from more than a single source.

For healthcare organizations, this could mean creating a database that takes patient names and addresses from one system and matches them up with scheduled appointments from another, or integrating claims data with clinical notes from the EHR.

Stitching multiple sources of information together into a centralized databank accessed by reporting or a query system can provide a more in-depth and actionable snapshot of a patient’s history, diagnoses, treatments, socioeconomic challenges, and risk profiles.
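
Here is a minimal sketch of that kind of stitching, joining demographics from one system to appointments from another on a shared patient ID; the field names and records are hypothetical.

```python
# A minimal sketch of stitching two sources into one patient-centric view:
# demographics from one system joined to appointments from another, keyed on
# a shared patient ID. Field names and records are hypothetical.
demographics = {
    "P001": {"name": "J. Rivera", "zip": "02139"},
    "P002": {"name": "M. Chen",   "zip": "60614"},
}
appointments = [
    {"patient_id": "P001", "date": "2016-05-02", "clinic": "cardiology"},
    {"patient_id": "P002", "date": "2016-05-09", "clinic": "nephrology"},
    {"patient_id": "P001", "date": "2016-06-01", "clinic": "cardiology"},
]

# Left-join appointments onto demographics to build the combined record.
patient_view = {}
for pid, demo in demographics.items():
    patient_view[pid] = {**demo, "appointments": []}
for appt in appointments:
    patient_view[appt["patient_id"]]["appointments"].append(
        {"date": appt["date"], "clinic": appt["clinic"]})

print(patient_view["P001"])
```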

But leveraging these disparate data sources requires the right tools and competencies, which aren’t always easy to develop.

Electronic health record vendors are starting to take big data analytics seriously by offering healthcare organizations new population health management and risk stratification options, but many providers still turn to specialized analytics packages to find, aggregate, standardize, analyze, and deliver data to the point of care in an intuitive and meaningful format.

These tools may include quality benchmarking and performance measurement systems, clinical analytics algorithms that monitor patients in real-time, revenue cycle and claims analytics, and population health management packages that foster engagement, deliver alerts and reminders, stratify beneficiaries, or gauge risk of a certain disease.
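
As a rough illustration of the risk stratification piece, the sketch below combines a few utilization and clinical signals into a score and buckets patients into outreach tiers; the factors, weights and cut points are hypothetical placeholders.

```python
# A minimal sketch of beneficiary risk stratification: combine a few clinical
# and utilization signals into a score and bucket patients into outreach tiers.
# The factors, weights and cut points are hypothetical placeholders.
RISK_FACTORS = {"chronic_conditions": 2.0, "er_visits_12mo": 1.5,
                "missed_appointments": 1.0, "polypharmacy": 1.0}

def risk_tier(patient):
    score = sum(w * patient.get(f, 0) for f, w in RISK_FACTORS.items())
    if score >= 10:
        return "high"      # care-manager outreach
    if score >= 5:
        return "rising"    # reminders and engagement nudges
    return "low"           # routine preventive care

print(risk_tier({"chronic_conditions": 3, "er_visits_12mo": 2, "polypharmacy": 1}))  # high
print(risk_tier({"chronic_conditions": 1, "missed_appointments": 1}))                # low
```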

In addition to the right technologies, providers must invest time and manpower into acquiring the competencies to make analytics work for them. This includes crafting a dedicated team of experts to oversee big data projects, implement and optimize software, and convince clinicians that these new strategies are worth their while.

View Source

Five Key Big Data Challenges Companies Need To Overcome When Developing A Big Data Strategy

Big Data will Increase IT Dependency

In recent years, IT has become a lot more important for many organisations, and in the coming years, with the appearance of the Internet of Things and the Industrial Internet, we will see many currently unconnected devices become datafied and start generating vast amounts of data. For companies that develop offline products and only use IT for their website, this will mean a rigorous change. In the coming years, IT will become a central part of all business units. Big Data will infiltrate and affect all departments within your organisation, and that requires a different way of working.

So apart from being able to access the data, Big Data will become an essential part of the different departments, which will consequently require IT staff of their own. For many organisations, IT will form a much more important aspect of the company, and for currently 'offline' companies this is a major shift that needs to be overcome.

Continue reading

EHRs and digital health tools ‘dramatically transforming’ care experience, patients say

Nearly 75 percent of patients expressed a high level of interest in accessing their electronic medical records, according to new research, and 33 percent indicated that EHRs have already changed their experience for the better.

“The patient experience is dramatically transforming,” CareCloud CEO Ken Comee said in a statement. “Patients of all ages are actually embracing digital online patient engagement tools from scheduling appointments to accessing their medical records and making online payments.”

Continue reading