7 Ways to Effectively Utilise Big Data in Organisations

The buzz around Big Data is undeniable. Regardless of the size of the organisation, managers can use this information to drive better, more effective organisational decision making through accurate analysis.


1. Improve Business Intelligence

Business Intelligence is the process of analysing data to help managers and corporate executives make sounder business decisions. Putting extra effort into improving your organisation’s business intelligence can accelerate decision making, optimise internal business processes, increase operational efficiency, generate new revenue, and identify recent market trends.

2. Practical Business Decisions Based On Customer Behaviour

Big Data contains a wealth of information about how an organisation’s customers act and behave, such as their interests, habits, and, in some cases, demographics. By analysing sales, market news, and social media data, organisations can collect real-time insights into their customers.

This article originally appeared in procurious.com. To read the full article, click here.

How Utilities Are Deploying Data Analytics Now

Utilities are sitting on a wealth of opportunity from data analytics, with more information than ever before flowing from smart meters and other sensors, along with traditional sources of data about their operations. Most utility executives see the potential to mine this data for insights, even if they aren’t quite sure how or where to start. They know these discoveries will eventually generate tremendous value for their organizations, but they tend to think this will be years away, after enormous investments to enhance systems and improve the quality of data.

The good news is, they don’t need to wait. Utilities already have access to data and tools that they can use to begin deploying production analytics and generating insights that create value. They are improving their ability to analyze data and understand their business. For example, some have already sharpened their accuracy in predicting equipment failures and power-outage durations—results that can reduce costs and increase customer satisfaction.

This article originally appeared in bain.com. To read the full article, click here.

How Analytics Is Aiding Banking Compliance

Regulations are costly and time consuming for banks, and they need to stay on top of their data.

Banks struggled to put this information together, highlighting a complacency and malaise that likely exacerbated the problems of the crisis. Lehman Brothers’ collapse heralded the beginning of a new era of regulations, though, with Dodd-Frank, introduced in 2010, and Basel III in 2011 among the most far-reaching and complex. Over $100 billion in fines has been paid in the US for non-compliance since 2007, and with a new Republican-led regime entering power, it is unclear what the future holds.

The time and cost of regulatory compliance and reporting increase vastly with every new regulation. Regulatory bylaws must, by their very nature, be thorough, and many run to hundreds of pages. Keeping up with them adds stress for financial services institutions at a time when new competition from FinTech is creeping up on all sides.

This article originally appeared in Innovation Enterprise. To read the full article, click here.

How Retailers Use Big Data to Gobble Up Sales

Billions of dollars will be spent this weekend as the Thanksgiving holiday gives way to Black Friday, Cyber Monday, and the beginning of the end-of-year shopping extravaganza. For retailers eager to get “back in the black,” big data analytics provides a great opportunity to juice profits by successfully converting on ample sales opportunities.

The volume of sales this weekend is expected to be massive. According to the National Retail Federation, 137.4 million Americans are expected to shop online or in stores over the four-day holiday, up from 135.8 million last year. Consumer confidence is high, thanks to a 5.2% increase in the median income of American workers (per the Census Bureau) and all-time highs reached on stock market indices.

This article originally appeared in Datanami. To read the full article, click here.

3 Ways to Solve App Performance Problems with Transaction Tracking – Part 2

In part 1, I discussed the importance of tracking applications and how it is similar to tracking packages. However, there is one significant difference between the two: applications don’t have bar codes. Collecting tracking events as data moves through the application requires additional processing. There are many techniques for doing this. The application can generate the events itself, in the form of a log or audit trail. But where it doesn’t, instrumenting the underlying system is an option, depending on what facilities it provides. For an application that executes through DataPower, IIB (Broker) and MQ, Nastel leverages several techniques to create the tracking events.


DataPower

DataPower can act as a front end or an intermediary node, and these flows are among the key ones that require visibility. Unfortunately, there is simply no place to collect that data centrally. We have found that the best way is to make slight modifications to the flows to collect the required data and send it as tracking events. Using this method, we can track the flows that pass through in very granular detail, as well as failures or performance problems in the flow. Many application flows already have some form of built-in tracking that can easily be leveraged as well.

IBM Integration Bus / IIB (Broker)

The Broker supports a very rich mechanism for tracking message flows. Without changing the internal structure of the flows, as is required with DataPower, you can still get to that level of detail, including:

  • Transaction Start / Stop (default)
  • When a given node was processed
  • Message content being processed by the flow
  • Track message flows in and across brokers

Controls within the Broker determine what type of data is sent and let you configure the level of detail you want. The data collected is published to broker topics, which are then forwarded as tracking events.
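As a sketch of how this is typically switched on (the broker, integration server and flow names below are placeholders, not taken from the article), IIB’s built-in monitoring can be enabled per message flow from the command line, after which monitoring events are published on the broker’s topics:

```
mqsichangeflowmonitoring MYBROKER -e MYSERVER -f MyMessageFlow -c active
```

A tracking tool then subscribes to the broker’s monitoring topic for that flow and consumes an event for each configured node.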


IBM MQ

IBM MQ provides two options for collecting the data, depending on the MQ version.

Using MQ API Exits

Available in all versions of distributed MQ, API exits can be used to capture information as it flows through the MQ environment. When an application makes an MQ call, the queue manager passes information about the call to the exit program, which examines the call and its data to decide what to do with the information. This procedure allows us to track information as it flows through the application environment and across the sending and receiving channels to multiple queue managers (distributed and mainframe).
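An API exit is registered in the queue manager’s qm.ini file. A minimal sketch, with a hypothetical exit name and module path (the actual exit is a compiled library that the queue manager loads):

```
ApiExitLocal:
   Name=TrackingExit
   Sequence=100
   Function=EntryPoint
   Module=/var/mqm/exits64/trackexit
```

Sequence controls the order in which exits are driven when more than one is configured.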

Tracking Using Application Activity Trace

With MQ 7.1 and above, the queue manager can be configured to generate the tracking events itself; the MQ Appliance uses this method exclusively. The data collected is the same as with MQ API exits. The activity trace method has some advantages over the exit approach: no code needs to be installed into the queue manager, it is easier to enable and disable, and it is easy to set up for remote access. However, it currently supports only limited filtering on the host MQ server, which can increase network traffic.
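As a sketch, activity trace is switched on with MQSC and then scoped in the mqat.ini file (the application-name pattern below is hypothetical):

```
ALTER QMGR ACTVTRC(ON)
```

```
ApplicationTrace:
   ApplName=payments*
   Trace=ON
   ActivityInterval=5
   ActivityCount=100
```

The ApplName pattern is the main server-side filter available, which is why the limited filtering noted above can translate into extra network traffic.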

Whichever method is used, the tracking events provide the information needed to see inside MQ processing.

Managed File Transfer (MFT/FTE)

Many customers currently leverage managed file transfer in their applications to integrate files with MQ flows and the Broker (IIB). The MFT coordinator publishes tracking events to record this activity, allowing you to see each transfer and any failures.


As noted in the introduction, the goal is a combined view of the flow across the environment. You need visibility into one or more of the technologies involved, such as browsers, mobile, DataPower, Broker, Managed File Transfer, MQ, Java applications and many more.

Nastel AutoPilot TransactionWorks analyzes this tracking information, interprets the data and produces the global transaction mapping. When events are tracked across all of these technologies, we can provide a complete picture of the application flow through multiple environments.
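TransactionWorks is a commercial product, so its internals aren’t shown here, but the core idea of producing a transaction mapping, stitching per-technology tracking events into one end-to-end path, can be sketched in a few lines of Python (the event fields and component names below are illustrative, not Nastel’s actual schema):

```python
from collections import defaultdict

def build_transaction_paths(events):
    """Group tracking events by correlation id, then order each group
    by timestamp to reconstruct the end-to-end path of a transaction."""
    groups = defaultdict(list)
    for event in events:
        groups[event["correlation_id"]].append(event)
    paths = {}
    for corr_id, evts in groups.items():
        # Events from different tiers may arrive out of order.
        evts.sort(key=lambda e: e["timestamp"])
        paths[corr_id] = [e["component"] for e in evts]
    return paths

# Example: events emitted by different tiers, arriving out of order.
events = [
    {"correlation_id": "txn-1", "timestamp": 3, "component": "MQ"},
    {"correlation_id": "txn-1", "timestamp": 1, "component": "DataPower"},
    {"correlation_id": "txn-1", "timestamp": 2, "component": "IIB"},
    {"correlation_id": "txn-2", "timestamp": 1, "component": "MFT"},
]
print(build_transaction_paths(events))
# {'txn-1': ['DataPower', 'IIB', 'MQ'], 'txn-2': ['MFT']}
```

The real work in a production tool is extracting a reliable correlation id from each technology’s events; once that exists, the stitching itself is straightforward.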

Read Part 1 in this 2 part series: “3 Ways to Solve App Performance Problems with Transaction Tracking”.

To learn more, watch the TechTalk here.

How CEOs Can Keep Their Analytics Programs from Being a Waste of Time

Despite billions of dollars invested in big data and analytics, the simple truth is that most projects and programs fail to meet expectations. And we have figured out why: analytics forces changes on the C-suite that the CEO has to anticipate and manage, but many don’t.

From how we choose presidents to what movies we choose to watch, big data and analytics have become integral parts of our lives. But for too many companies, analytics is an unsolved puzzle with the pieces flung all over the floor.

This article originally appeared in hbr.com. To read the full article, click here.

Give Predictive Analytics a Boost with the Human Touch

Cultivating customer loyalty is a top priority for any business, and most brands today believe that if they can preemptively gain a better understanding of customer needs, they can take subsequent steps to anticipate and fill those needs to create a more positive customer experience.

To accomplish these goals, companies often collect as much customer data as possible to analyze consumer behavior, identify patterns, and use analytics to predict what their customers are likely to do next.

Predicting Individual Behavior

Bottom line: even when companies have the best of intentions in using predictive analytics to anticipate consumer behavior, they also need to be right in each individual case.

For example, companies may use predictive analytics to find customers who are exhibiting the kinds of behaviors that might indicate they’re thinking of taking their business elsewhere. Those companies can then take action to encourage dissatisfied customers to stay before they actually leave. However, not every customer may respond to that action in the same manner.

This article originally appeared in cmswire. To read the full article, click here.

Figuring Out How IT, Analytics, and Operations Should Work Together

A new set of relationships is being formed within companies around how people working in data, analytics, IT, and operations teams work together. Is there a “right” way to structure these relationships?

Data and analytics represent a blurring of the traditional lines of demarcation between the scope of IT and the responsibilities of operating divisions. Consider the core mission of the modern IT department: Taking in all the technology “mess” (often from several different divisions), developing the necessary competencies, and delivering savings and efficiency to the company. Many IT organizations, having achieved this original mission, now are turning to the next thing, which is innovation.

Enter data and analytics, which provide an opportunity for such innovation. However, data traditionally is owned by the business, and analytics is valuable only to the extent that it is used to make business decisions, again “owned” by the business. For IT to operate in the data and analytics space often takes realigning roles and responsibilities.


This article originally appeared in hbr.org. To read the full article, click here.

Four Roadblocks to Becoming Data-Driven, and How to Overcome Them

Today’s most competitive companies are data-driven. Consider Uber, which assailed the taxicab industry in San Francisco in just a few years, without owning a single taxicab. How? Among many innovations, Uber brought data to the taxi industry. Using historical data, Uber advises drivers to be in certain hotspots during certain times of day to maximize their revenue, because customers tell them with the push of a button where to be.

These companies don’t rely on hunches, siloed spreadsheets, or data on rogue servers to make decisions; instead, they have operationalized data as a part of every process and decision and built cultures where guesswork doesn’t suffice. Operationalizing data, or using data to improve business performance, will be the defining competitive advantage of the future.


This article originally appeared in DataInformed. To read the full article, click here.

Success Story: 5 Retailers Who Placed Big Bets on Big Data Analytics

Big data, the game-changing technology, has gone mainstream, enabling organizations to collect, organize and get value out of overwhelming amounts of data on the fly. The retail industry is taking maximum advantage of this technology to better tune its services to customers’ needs.

According to McKinsey reports, “Retailers are using big data to improve operations across the board, including merchandising, marketing, e-commerce and multichannel, supply chain, and store management, and operations.”

By quickly identifying valuable opportunities and starting with the customer decision journey, these online retailers have set a great example for the retail market and achieved tremendous growth by counting on big data analytics.


This article originally appeared in BrainVire. To read the full article, click here.