Maximo Hybrid Cloud

Are you looking to use a hybrid cloud option to host your IBM Maximo instance? IBM has a few thoughts regarding this option, as reported by Payal Chakravarty.

Hybrid cloud—the combination of using public and private cloud to host your applications and infrastructure—is becoming the de facto standard for cloud adoption rather than a transitory architecture. According to several industry surveys, 70 percent of organizations are using or evaluating the hybrid cloud today. There are two distinct directions in which the move to hybrid cloud is occurring: private to public cloud and public to private cloud.

Private to public cloud
Enterprises with big IT investments in their on-premises data centers are gradually moving certain workloads to the public cloud to gain efficiency, scalability and agility. Workloads with sporadic demand are the most commonly shifted to the public cloud. Bound by privacy and security restrictions, these enterprises tend to keep their data on-premises in their private cloud.

Public to private cloud
Companies that were born in the cloud and utilized public cloud to scale without spending on an IT organization are realizing that, over time, the public cloud costs add up. It’s like your daily coffee bill. You think you are only spending four dollars every day, but when you look at the cumulative bill at the end of the year, you will be shocked to find out how much you are spending on coffee. You might as well have bought a coffee maker.

Another aspect of the move from public to private is that once user demand reaches a steady state, the need for auto scaling is no longer significant. You can save substantially by running a private OpenStack cloud in your own data center instead of running full-scale production on a public cloud. These realizations have dawned on several companies, which have started to relocate some of their workloads in-house to improve cost efficiency and gain more control over their data and resources.

The movement of workloads between private and public clouds is driving the need for a stable hybrid architecture.

With this movement, companies must investigate how to span applications across such a hybrid environment. An application typically runs on a stack that comprises:

• Front-end code, which may have dependencies on third-party services
• Middleware such as application servers, web servers, message queues and so forth
• Data sources such as SQL or NoSQL databases
• The operating system

The front end, which needs to auto scale with user demand, is typically the best candidate for hosting on the public cloud. The middleware and database can reside anywhere. However, the trend indicates that databases are the most popular candidates for keeping in house.
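
As a minimal sketch of this split, the Python below models which tier of a hybrid stack lives in which cloud. The tier names, fields and placements are hypothetical illustrations, not any vendor's API:

```python
from dataclasses import dataclass

# Hypothetical model of a hybrid application stack; tier names and
# placements are illustrative, not any vendor's API or product.
@dataclass
class Tier:
    name: str
    location: str      # "public" or "private" cloud
    auto_scale: bool

# Front end auto-scales on the public cloud; middleware and the
# database stay in the private data center, per the trend above.
stack = [
    Tier("front-end",  location="public",  auto_scale=True),
    Tier("app-server", location="private", auto_scale=False),
    Tier("database",   location="private", auto_scale=False),
]

for tier in stack:
    scaling = "on" if tier.auto_scale else "off"
    print(f"{tier.name}: {tier.location} cloud, auto-scaling {scaling}")
```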

To manage the lifecycle of a hybrid application, you need to be able to deploy, manage and monitor across the hybrid cloud. Several tools are evolving to handle this shift.

Tools like IBM Cloud Orchestrator let you define hybrid patterns to solve the deployment problem. This means you have a set of application resources and configurations that spans your public cloud, like IBM SoftLayer, and your private cloud based on VMware. You can capture that mapping and configuration as a pattern and deploy that pattern in five minutes across both environments.
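
As a rough illustration of the idea (a toy model, not IBM Cloud Orchestrator's actual pattern format), a hybrid pattern boils down to a declarative mapping of components to target environments that a deployer can replay:

```python
# Illustrative only: a toy "pattern" expressed as a declarative mapping
# of components to target environments. IBM Cloud Orchestrator's real
# pattern format differs; every name here is hypothetical.
pattern = {
    "name": "web-app-hybrid",
    "components": [
        {"role": "front-end", "target": "softlayer-public", "image": "web-vm"},
        {"role": "database",  "target": "vmware-private",   "image": "db-vm"},
    ],
}

def deploy(pattern):
    """Walk the pattern and dispatch each component to its target cloud."""
    for component in pattern["components"]:
        # A real orchestrator would call the target cloud's API here;
        # this sketch only reports the intended placement.
        print(f"deploying {component['image']} as {component['role']} "
              f"to {component['target']}")

deploy(pattern)
```

The point is that the pattern captures the placement mapping rather than the deployment mechanics; replaying it gives you repeatable deployments across both environments.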

Additional tools like IBM Application Performance Management (APM) now let you manage and monitor hybrid applications whose stack components reside anywhere. If your application’s front-end code is hosted on a public cloud like Amazon or IBM SoftLayer and its back end is a Tomcat server and MySQL database that reside in your own data center, IBM APM tools will discover the two and let you visualize them together on a single pane of glass so that you can see your application’s performance end to end.
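
You can approximate that end-to-end view by hand. The sketch below probes each tier of a hybrid stack and reports reachability in one place; the hosts and ports are hypothetical, and this is not APM's API:

```python
import socket

# Hypothetical endpoints for a hybrid stack: front end on a public
# cloud, Tomcat and MySQL in the private data center.
ENDPOINTS = {
    "front-end (public)": ("frontend.example.com", 443),
    "tomcat (private)":   ("10.0.0.5", 8080),
    "mysql (private)":    ("10.0.0.6", 3306),
}

def probe(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# One "pane of glass": every tier's reachability, wherever it lives.
for name, (host, port) in ENDPOINTS.items():
    status = "up" if probe(host, port) else "down"
    print(f"{name}: {status}")
```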

Why Fact-Based Data Is Critical to Your Maximo Implementation

After attending Pulse 2013, you are overwhelmed with ideas and visions from every direction: Cloud, Security, Instrumented Assets, Analytics, Mobile, and Social Media. Big data is a big priority for IBM; in other words, a Smarter Planet. But it doesn’t take long to descend from the cloud, get back to where the rubber hits the road, and try to relate what all of that means to those of us just trying to implement the basics of Maximo. Clean, reliable, accurate data is the single most important aspect of your EAM, because your credibility and the credibility of your business depend on making informed decisions based on accurate data. An estimated 40 percent of all Maximo implementations fail because they are not rooted in objective, fact-based data.

Data that Matters
How many of us have been in the situation of defining requirements where we ask managers or users what data they want, and invariably they say: we want it all; whatever the system can provide, we want it. The better question to ask is:

What data drives the decisions and improvements that you need to do your job?

Having it all is a recipe for disaster unless you have a multi-million-dollar budget, a business need, and the resources required to gather, secure, analyze, and act on large volumes of random data. The more you want, the more it costs, and the more complicated it is to get. Collecting data just for the sake of it is a waste of time for most of us. Don’t load Maximo up with dirty data; it erodes system credibility and the trust of the users. Less good data is better than more bad data. Of course, that leads to the argument that one man’s junk is another man’s treasure. Make sure that whatever data is deemed necessary is justified with a business case. Just because you can doesn’t mean you should. Don’t do things that don’t matter. Bad data leads to bad decisions.

Business Process Re-Engineering
When you finally define the data that matters, you need to determine how you currently get that data into the hands of those who need it. Relentlessly scrutinize your business processes and get answers to the following questions: Why do you do what you do, how do you do it, and who does it?

Analyze how you can do this better and streamline your business processes to support the data you need and the way you get that information to the right people.

“Things should be as simple as possible, but no simpler.”

Albert Einstein

It is important to simplify the user experience. If you don’t, you won’t consistently get the data that you need. Remove roadblocks and meaningless steps in your processes. Don’t make users do things that don’t matter to them. “Make it work like I do”: use smart phones and tablets, and configure Maximo to support the user’s work processes. Optimize the technology to push and pull data to and from users, reduce double entry, make it easy to capture what was done, and make it easy to access what needs to be seen.

Realistic Expectations
My definition of success is results meeting expectations. Having realistic expectations about the data is very important to how you judge whether Maximo is delivering what you need. For example, say you want to capture labor and material costs on a work order:

Case A –

  • User enters whatever hours and materials they want on the work order.

Case B –

  • Labor hours are entered in a time/attendance/payroll system integrated with Maximo work orders to capture labor hours and costs.
  • Materials are charged to work orders via issues from inventory.
  • PO materials and contracts are charged to the work order upon receipt.

These are two very different expectations in terms of the data being captured and its degree of accuracy. Don’t expect Case B results if your data reality is Case A. Remember: the more you want, the more it costs, and the more complicated it is to get. You have to decide how important the data is and how much it will cost to get it, and then set realistic expectations.
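
To make the Case B expectation concrete, here is a minimal sketch of rolling labor and material transactions up to a work order cost. The record shapes and field names are hypothetical illustrations, not Maximo's actual schema or API:

```python
# Hypothetical transaction records, not Maximo's actual schema.
# In Case B, labor comes from the integrated time system, materials
# from inventory issues, and PO lines from receipts.
labor_transactions = [
    {"workorder": "WO-1001", "hours": 4.0, "rate": 55.00},   # from time system
    {"workorder": "WO-1001", "hours": 2.5, "rate": 62.00},
]
material_transactions = [
    {"workorder": "WO-1001", "qty": 3, "unit_cost": 18.75},  # inventory issue
    {"workorder": "WO-1001", "qty": 1, "unit_cost": 240.0},  # PO receipt
]

def workorder_cost(wo):
    """Roll up labor and material charges for one work order."""
    labor = sum(t["hours"] * t["rate"]
                for t in labor_transactions if t["workorder"] == wo)
    material = sum(t["qty"] * t["unit_cost"]
                   for t in material_transactions if t["workorder"] == wo)
    return labor, material

labor, material = workorder_cost("WO-1001")
print(f"WO-1001 labor: ${labor:.2f}, material: ${material:.2f}")
# In Case A, by contrast, these numbers are whatever the user typed.
```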

Why is this Important?
Because everything discussed above provides the design specifications for your Maximo Implementation:

  • Data requirements
  • Business processes needed to support data gathering
  • User expectations of the results to be delivered by Maximo

“If you don’t know where you are going, you will probably end up somewhere else.”

— Sidney Heyward

This is the foundation, roadmap, and requirements for designing and building out Maximo to support your business needs. As you build and test, you compare the functionality against the requirements and expectations. “Going Live” is the reality check and, of course, never goes quite as we anticipate. All of us have to deal with the changing winds, rough conditions, and political currents of our work environment that are constantly trying to take us off course. Don’t lose sight of why we use Maximo:

  1. Asset Reliability – if your assets aren’t running you are out of business
  2. Improved Efficiency – streamline business processes to gather data
  3. Command and Control – your success depends on making informed decisions based on accurate data

____________________________________

About Randy McDaniel:
Randy has a B.S. degree in Mechanical Engineering from California State University, Fullerton, and has spent over 35 years in the fields of maintenance engineering, maintenance planning, capital projects construction, and facilities maintenance. His industry experience includes oil refineries, petrochemical plants, universities, steel mills, assembly plants, lumber mills, and utility plants.

He has spent time as a Maximo senior consultant providing business process re-engineering assessments and managing Maximo implementations. A vocal advocate of Maximo, Randy has been the Chairman of the Southern California Maximo Users Group since 1998 where he often presents best practices, tips and other real life Maximo experiences.

Currently Randy is the Maximo System Administrator and Facilities Management Information Systems Integration Manager at the University of California Los Angeles. He manages the implementation of Maximo and provides IT integration direction and vision for the General Services business unit.

This post originally appeared on the Tivoli User Community boards on March 18, 2013, and is reprinted with permission of the author.