Tuesday, July 24, 2012

Catalyst System Management Provides Centralized Management

Chances are that you are reading this article via your favorite RSS reader, in my case, Flipboard. Or perhaps you were pulled in via LinkedIn or Twitter. No matter the exact method, there is a common thread among all of these: simply put, you wanted to avoid manually checking each of your favorite websites independently. Like many users, as you come across something interesting, you add it to your favorite aggregator - you did add us, right? Once you are subscribed (or following), the magic happens; you can read content from hundreds of blogs and sites from a single interface. Think of the time that you save by having all of that data pulled to you – no need to visit each site separately.

So, why am I highlighting some of the points of social media (besides a shameless plug – again, you are following or subscribing, right)? The recent release of Savigent’s Catalyst Platform™ offers several benefits – one that I would like to highlight is the addition of Catalyst System Management™ to the Platform. This tool allows you to browse, monitor and maintain your Catalyst Platform™ in one central place. Think of the assets (or nodes) on your plant floor as websites or blogs; you might have hundreds… System Management allows you to keep in touch without typing URLs each time or physically going to the asset.

That is just the beginning. Unlike a simple RSS feed, Catalyst System Management™ allows for interaction with the assets on the Platform. You can configure node data, push updates of code and configuration data, and monitor system health in real time. Think of the time that you will save in managing your system… But a word of caution: just as with an RSS reader, you might find yourself getting addicted to the information!
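
To make the idea a little more concrete, below is a minimal Python sketch of what centralized health monitoring looks like in general: one script polling many nodes from a single place, rather than someone walking out to each asset. The node names, the /health endpoint and the response fields are all hypothetical stand-ins, not Catalyst System Management™ interfaces.

    import requests  # third-party HTTP client (pip install requests)

    # Hypothetical plant-floor nodes; a real deployment would read these
    # from the platform's own registry rather than a hard-coded list.
    NODES = ["press-01.plant.local", "oven-02.plant.local", "filler-03.plant.local"]

    def poll_node_health(host, timeout=2.0):
        """Fetch a (hypothetical) health document from a single node."""
        try:
            resp = requests.get(f"http://{host}/health", timeout=timeout)
            resp.raise_for_status()
            return resp.json()  # e.g. {"status": "ok", "cpu_percent": 12.5}
        except requests.RequestException as exc:
            return {"status": "unreachable", "error": str(exc)}

    def poll_all(nodes):
        """One central loop instead of visiting every asset separately."""
        return {host: poll_node_health(host) for host in nodes}

    if __name__ == "__main__":
        for host, health in poll_all(NODES).items():
            print(f"{host}: {health.get('status')}")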

Thursday, July 19, 2012

Bull Durham and Catalyst Platform: Composite Applications

This summer is the 25th anniversary of one of my all-time favorite movies. Ron Shelton’s Bull Durham is the classic story of minor league baseball player “Crash” Davis, an aging veteran who is sent to the minor leagues to help train a flashy young pitcher, Ebby Calvin “Nuke” LaLoosh, in the ways of professional baseball. In the process, he meets and falls in love with die-hard baseball fan Annie Savoy, who, unfortunately for “Crash,” has already taken a romantic interest in young “Nuke.”

While there are a number of memorable scenes, one of the best involves Annie discussing her beliefs on love. In educating both “Crash” and “Nuke” on her beliefs, she talks about how matters of the heart are really out of our control and more a matter of quantum physics. As “Crash” rises to leave, stating that after all of his time in the minors he doesn’t believe in trying out, Annie asks, “Well, what do you believe in?”

After “Crash” replies with a soliloquy that covers the designated hitter, the JFK Assassination, good scotch and “deep, slow wet kisses that last three days”, “Nuke” replies, “Hey Annie – what’s all this molecule stuff?”

So, what does this have to do with Savigent’s recent release of Catalyst Platform? While this release provides a number of features, many of them appear, on the surface, to offer only technical benefit. While my co-authors will address the benefits of these features in future posts, some users may be a little like “Nuke” in the scene mentioned above, asking, “What’s all this technical stuff?”

In an effort to demonstrate that there’s more than just technical benefit to our new release, I want to highlight a feature that can deliver immediate business results – composite applications. Simply put, a composite application is an “application” built from other applications. Like the “mash-ups” of the service-oriented development world, composite applications are built from reusable components that can quickly be rearranged in response to changing business priorities and conditions. The ability to build composite applications, and to reuse components across them, within Catalyst Platform will allow our clients to build architectures that are flexible, adaptable and extensible. More importantly, when coupled with an iterative development approach, composite applications built on the Catalyst Platform will allow customers to develop applications more quickly, leading to a faster return on their investment.
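
To make the idea concrete, here is a minimal sketch (in plain Python, not Catalyst’s own tooling) of a composite application assembled from reusable components. The component names and data are invented for illustration; the point is simply that the same parts can be recombined as business priorities change.

    from typing import Callable, List

    # Each reusable component is just a callable that transforms a record.
    Component = Callable[[dict], dict]

    def collect_machine_data(record: dict) -> dict:
        record["temperature_c"] = 72.4          # stand-in for a real data source
        return record

    def check_quality_limits(record: dict) -> dict:
        record["in_spec"] = record.get("temperature_c", 0.0) < 80.0
        return record

    def notify_supervisor(record: dict) -> dict:
        if not record["in_spec"]:
            print(f"Alert: lot {record['lot']} is out of spec")
        return record

    def compose(components: List[Component]) -> Component:
        """Build a composite application by chaining reusable components."""
        def app(record: dict) -> dict:
            for component in components:
                record = component(record)
            return record
        return app

    # Two composite applications built from the same parts, arranged differently.
    quality_monitor = compose([collect_machine_data, check_quality_limits, notify_supervisor])
    data_logger = compose([collect_machine_data])

    print(quality_monitor({"lot": "A-1042"}))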

Given the dynamic nature of today’s manufacturing environment, leading software providers have to provide tools that facilitate faster, more scalable development. Catalyst Platform creates the foundation to meet this challenge. That’s what I believe – in case Annie asks…

Tuesday, July 17, 2012

Savigent’s Catalyst Platform v4.0 Lays Foundation for Migration to Windows Azure

Includes new Catalyst Bus™ with real-time information access via OData services
MINNETONKA, Minn. – July 17, 2012 – Savigent Software, Inc., the Minnetonka-based company specializing in event-driven manufacturing operations management software, announced today the release of Catalyst Platform™ version 4.0. The updated version delivers significant enhancements to the software, most notably a fully managed infrastructure for the development, deployment and management of composite applications. Catalyst Platform™ also provides a central repository for revision controlled management of composite applications and a new, highly scalable Catalyst Bus™ with real-time information access via OData services.

Michael Feldman, company CTO, said, “The release of Catalyst Platform™ lays a foundation for the migration of the entire Catalyst suite to Windows Azure and provides significant new functionality that supports enterprise-wide development, deployment and management of composite applications in data centers, and private and public clouds.”

Catalyst Platform™ is the foundation of Savigent’s expanding Catalyst™ suite of software products. The software dramatically simplifies the development and management of highly scalable service-oriented software solutions in the manufacturing environment. It combines three powerful capabilities into one software product: a composite application development framework, a unifying service architecture and a managed execution environment.

The enhancements present in Catalyst Platform™ version 4.0 provide users with significantly increased functionality, flexibility and simplicity. Catalyst Platform™ provides a central repository for revision controlled management of composite applications and their configuration, and managed deployment to the Catalyst environment for execution. Catalyst Bus™ extends the Service Oriented Architecture of Catalyst Platform™ by implementing standardized service patterns for commonly used interactions in the environment. Data within the Catalyst Bus™ is available in real-time via OData services. Catalyst Development Studio™ provides a visually intuitive environment for the assembly of composite applications from highly configurable, prebuilt agents.
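
Because OData is an open, HTTP-based protocol, information exposed this way can be read with ordinary tooling. The sketch below queries a hypothetical Catalyst Bus™ OData endpoint from Python; the service URL, entity set name and response shape are placeholders, while the query options ($orderby, $top, $format) are standard OData conventions.

    import requests

    # Placeholder service root; a real deployment publishes its own URL.
    SERVICE_ROOT = "http://catalyst.example.com/odata"

    def latest_events(entity_set="Events", top=10):
        """Read the most recent records from an OData entity set as JSON."""
        params = {
            "$orderby": "Timestamp desc",   # standard OData query options
            "$top": str(top),
            "$format": "json",
        }
        resp = requests.get(f"{SERVICE_ROOT}/{entity_set}", params=params, timeout=5)
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        # OData 2.0 verbose JSON wraps results as {"d": {"results": [...]}}.
        for event in latest_events().get("d", {}).get("results", []):
            print(event)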

Jay Mellen, Savigent’s executive vice president of business development, remarked, “This is a very exciting release with loads of new, differentiating functionality. For example, Catalyst Bus™ dramatically simplifies service implementations and it is OData accessible, so domain experts and IT professionals can access information quickly and easily using a wide variety of applications tailored to meet their needs. And we are continuing to keep our entire product line on the leading edge of Microsoft technologies, with support for Microsoft .NET 4.0, SQL Server 2012, SharePoint 2010, Windows Server 2008 R2 and Windows Azure.”

With the release of Catalyst Platform™, Savigent is rebranding its entire Catalyst suite of software products. Catalyst Workflow™ delivers a controlled system for workflow automation, providing manufacturers with guaranteed compliance, unparalleled traceability and rich manufacturing intelligence. Catalyst Historian™ provides manufacturers with a real-time, context-aware data historian, in addition to a comprehensive data analysis tool.

Founded in 1994, Savigent Software has pioneered a new class of event-driven manufacturing operations management software. The company currently serves manufacturers in a variety of industries including automotive, semiconductor, industrial, specialty chemical, consumer packaged goods, and aerospace and defense. Customers served by Savigent are seeking increased efficiencies, agile control of manufacturing assets, and improved process control and product quality. The company also serves OEMs and independent software vendors by providing value-added software solutions for their products. Its Catalyst™ suite of products provides solutions for workflow automation, manufacturing intelligence and systems integration.

More information about Savigent Software can be found at savigent.com. Read about manufacturing operations management on our Level3 blog at blog.savigent.com. Follow us on Twitter at twitter.com/savigent.

Tuesday, July 3, 2012

Workflow Automation provides a “Business Process Historian”

In our interactions with prospective customers, we often see them struggling to answer seemingly simple questions like:
  • What material lot was used in this batch?
  • When did we detect this issue?
  • Why was this decision made?
  • Who postponed the maintenance on the equipment?

In a manufacturing enterprise, information exists in a variety of systems (ERP, MES, LIMS, QMS, WMS, SCADA and IO devices, etc.), each an island of information that needs to interface and communicate with the others as part of the intricate orchestration of process execution. Each island is a System of Record (SOR) – an information system that is the authoritative data source for a given data element or piece of information. ISA-95 states that the activities of manufacturing operations management are those activities of a manufacturing facility that coordinate the personnel, equipment, material and energy in the conversion of raw materials and/or parts into products.

As manufacturing business processes execute, data from various SORs is used and/or changed. Why, then, is it so difficult to answer the questions above when all the data required exists in SORs? The reasons are not that obvious.

First off, many of the standard operating procedures (SOPs) in the manufacturing environment are manual in nature (people driven), and a majority of them are performed on an as-needed basis. Since the SOPs are executed manually, reconstructing a detailed account of what happened, when it happened, and why it happened typically involves, at best, paper records and, at worst, meetings with people discussing what they recall happening. Unfortunately, most people look at SORs as individual systems that are eventually updated through manual data entry or, in some cases, through automated data flows using a variety of techniques (e.g., file transfers or custom code extensions of the individual SOR). I am not going to address the problem of Band-Aid / spaghetti-code integration in this blog posting; it is a topic of its own.

The second problem that prevents people from answering the questions is that they don’t have access to the transient history of process execution that occurs outside of these systems. In other words, these systems don't store the change history needed to trace the lineage of causes and effects required to understand what, when and why. This information is essential to driving continuous process improvement within the manufacturing enterprise. Applications like process historians, if used to capture information, typically include only time context and are unable to handle complex and transactional data from SORs like ERP and MES. Likewise, ERP systems typically lack detailed execution information.

As an example, ask yourself how you would answer the following question: what production request was the first to process after equipment maintenance was performed on a specific piece of equipment? In most cases the answer we get is that someone has to know how to query the records related to the equipment from the maintenance management system and, from the time-stamp of the record, try to find the production request that was processed around that time. Time, once again, is the only relevant context available to link information between SORs.
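
To see why time alone is a weak link, consider a rough sketch of that manual approach: take the maintenance record’s timestamp and guess which production request started closest to it. The record layouts below are invented for illustration.

    from datetime import datetime

    # Invented records standing in for two separate systems of record.
    maintenance_records = [
        {"equipment": "PRESS-07", "completed": datetime(2012, 7, 2, 14, 5)},
    ]
    production_requests = [
        {"request": "PR-1001", "equipment": "PRESS-07", "started": datetime(2012, 7, 2, 13, 55)},
        {"request": "PR-1002", "equipment": "PRESS-07", "started": datetime(2012, 7, 2, 14, 20)},
    ]

    def first_request_after_maintenance(equipment):
        """Correlate by timestamp only: the fragile, manual approach."""
        done = max(r["completed"] for r in maintenance_records
                   if r["equipment"] == equipment)
        candidates = [p for p in production_requests
                      if p["equipment"] == equipment and p["started"] >= done]
        # If clocks drift or the request was entered late, this guess is wrong.
        return min(candidates, key=lambda p: p["started"]) if candidates else None

    print(first_request_after_maintenance("PRESS-07"))  # PR-1002, we hope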

What is missing is one very important element of context of information – its “lineage” or ordered history of information changes. Lineage adds the essential context that relates instances of execution with a timeline of events and activities. Lineage also provides information about the duration of process execution, be it related to equipment, material transformations, system performance, or people’s responsiveness. From a continuous improvement perspective, lineage provides in situ metrology for the performance of SOPs and for all participants in the SOP.

Our preferred solution to the lineage problem is to automate SOPs using Catalyst workflows – because workflow automation provides a “business process historian.” To enable high reliability and fault tolerance, the Catalyst Workflow Engine uses a transactional data store that records all of the execution data related to a workflow, as well as all data that is passed into and out of a workflow to and from connected SORs (systems, equipment and people). Workflow automation provides a system that contains a complete lineage and therefore provides visibility across all SORs participating in the execution of SOPs.

Revisiting the example, consider an alternative workflow-based approach for SOPs that captures data upon each of the following (a minimal sketch of what such a workflow might record follows the list):

  • Interaction within any SOR (a request to make product was entered),
  • Presentation of a task to any person (an operator was told to go make product),
  • Communication with any asset (the equipment in the workstation experienced a state change to running and is now making product).
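
As a minimal sketch, and assuming entirely hypothetical step names and payloads, the ordered history such a workflow might record could look something like this:

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List
    import uuid

    @dataclass
    class LineageEvent:
        """One entry in the ordered history of a workflow instance."""
        workflow_id: str
        step: str        # e.g. "order_received", "task_presented"
        source: str      # the SOR, person or asset involved
        payload: dict
        timestamp: datetime = field(default_factory=datetime.utcnow)

    class WorkflowHistory:
        """Stand-in for the transactional store behind the workflow engine."""
        def __init__(self):
            self.events: List[LineageEvent] = []

        def record(self, workflow_id, step, source, payload):
            self.events.append(LineageEvent(workflow_id, step, source, payload))

    # A hypothetical "make product" SOP captured as ordered lineage.
    history = WorkflowHistory()
    wf = str(uuid.uuid4())
    history.record(wf, "order_received", "ERP", {"request": "PR-1002"})
    history.record(wf, "task_presented", "Operator", {"task": "start batch"})
    history.record(wf, "equipment_state", "PRESS-07", {"state": "running"})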

The execution history of the workflow(s) provides the lineage required to quickly and easily answer the question posed. But perhaps more importantly, the complete business process history is known. Because all data related to the execution of workflows is accessible, the information required to manage SOP performance exists. This provides additional insight into questions like the following (a sketch of reading such metrics from the captured events follows the list):

  • How long did it take for the operator to actually go and make the product? (uncovering unnecessary delays, which are wasted time)
  • How long did it actually take to make the product? (providing true cycle time for the product within the workstation)
  • How much idle equipment time was experienced in the process? (providing true equipment utilization on productive tasks)
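
Continuing the sketch above (again with invented step names and timestamps), answering these questions becomes a matter of reading durations out of the ordered event history rather than reconstructing the past from separate systems:

    from datetime import datetime

    # Minimal stand-in events, the same shape as the history sketched earlier.
    events = [
        {"step": "task_presented", "timestamp": datetime(2012, 7, 2, 14, 10)},
        {"step": "equipment_running", "timestamp": datetime(2012, 7, 2, 14, 32)},
        {"step": "equipment_idle", "timestamp": datetime(2012, 7, 2, 15, 5)},
    ]

    def duration_between(events, start_step, end_step):
        """Elapsed time between two named steps of one workflow instance."""
        start = next(e for e in events if e["step"] == start_step)
        end = next(e for e in events if e["step"] == end_step)
        return end["timestamp"] - start["timestamp"]

    # Operator responsiveness: task presented until the equipment is running.
    print(duration_between(events, "task_presented", "equipment_running"))
    # True cycle time within the workstation: running until back to idle.
    print(duration_between(events, "equipment_running", "equipment_idle"))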

None of this information typically exists within an individual SOR, and without workflow automation, we are often left with an incomplete and incomprehensible view of the past. So with workflow automation as a Business Process Historian, not only can we answer the question posed, but we can support continuous improvement efforts with detailed data about the performance of SOPs.