Is Your Architecture Truly Open?
Enterprise system architecture can be evaluated from many different perspectives. As with conventional building architecture, a solid system design must weigh several different criteria to maximize the practical value of the engineered construction. When we talk about Enterprise Architecture, several criteria are commonly used to gauge the success or failure of the architecture:
- Ease of management
- Ease of integration
- Response to failure
- Reporting and analytics for the above
Among those best practices sits the question raised in the title: is your architecture truly open?
Why does open architecture matter?
When engineering an enterprise solution architecture, many criteria matter. Open architecture matters because practically every enterprise solution must meet the non-functional requirement of co-existing in an established technical ecosystem. Enterprise systems must interact with each other to accomplish business tasks, and these systems may be provided by separate vendors, built in-house, or may rely on 3rd party APIs as a layer for interfacing with various web services and systems.
This article highlights the benefits of truly open and relevant standards as related to enterprise architecture.
Some key terms
Open architecture differs from open source. Open source software involves sharing raw source code to gain the benefits of crowdsourcing in building out the code. Open architecture is focused on easily decoupling data from the proprietary layers of the code, so that data can easily be exchanged with other systems and business logic. It also means the architecture is easily and highly extensible on the back end, so that any front end (a user experience or an integration with other applications) can be applied through a robust, fully featured API. System-to-system communications must be easy, reliable, and efficient.
Even when an architecture is implemented by proprietary technologies, we still consider it open if the architecture is able to facilitate data transfer in and out of the system and support data transfers across systems. In addition to direct API support, this includes access to system functionality via meaningful layers of abstraction to serve as the glue between business rules and other enterprise systems.
Best practices for enterprise system architecture
Quark Publishing Platform is an Enterprise Java Web Application built on the open source Spring Framework. That means it’s scalable, secure, incredibly extensible, and easily adopted by IT departments with sophisticated and challenging requirements. We support major investment banks with their extremely complex IT requirements, as well as small 20-person shops that just need “out of the box” to work well.
Extensibility is core to every product we build and every enhancement we make:
- We support everything from adding a custom service, which can automatically be exposed through our REST interface, to rebuilding the entire web-based user experience for a custom business portal using nothing but our APIs.
- We support many server-side Java integration points (such as JMS), as well as our robust RESTful interfaces which expose API access to the features a customer needs (and we use the same APIs ourselves for our cross-product integration).
- When there is a gap in the API based on a new customer requirement, we can address it very quickly. In response to customers’ requests, we have added new content markup in Quark Author to drive new publishing features in QuarkXPress Server, all in a single development cycle. This was true for “regions” in Quark Author, and in our next release cycle (due in September) we’ll add two more: index term markup and index sort/format publishing.
- In the content design space, Quark (via QuarkXPress) invented the use of an SDK for 3rd parties to extend the application in extremely useful ways. We even created a marketplace for XTensions, as they’re known, that at one time was larger than many entire software companies. Other desktop design software vendors directly duplicated that model in their products. The XTensions model was certainly the first such commercial retail “add-in” marketplace in the 1990s that current Quark CTO Dave White was aware of, long before he was directly involved with Quark.
A tremendous number of Quark’s capabilities come from our integration with XML:
- Authoring tools’ content models
- Most of our products’ configuration files
- REST API posts and responses (also available in robust, modern JSON)
- Even our QuarkXPress Modifier format for automated publishing
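As a sketch of what format-agnostic REST payloads can look like in practice, the fragment below normalizes a small asset response whether the server returned XML or JSON. The endpoint shape, field names, and payloads are illustrative assumptions, not Quark’s actual schema:

```python
import json
import xml.etree.ElementTree as ET

def parse_asset(body: str, content_type: str) -> dict:
    """Normalize a REST response body into a plain dict, whether the
    server returned JSON or XML. Field names are hypothetical."""
    if content_type == "application/json":
        return json.loads(body)
    root = ET.fromstring(body)
    return {"id": int(root.findtext("id")), "name": root.findtext("name")}

# The same hypothetical asset, serialized both ways:
xml_body = "<asset><id>42</id><name>datasheet.xml</name></asset>"
json_body = '{"id": 42, "name": "datasheet.xml"}'

# Both wire formats yield the same normalized record:
assert parse_asset(xml_body, "application/xml") == parse_asset(json_body, "application/json")
```

A client written this way can follow the server’s content negotiation rather than being bound to one serialization.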
It’s a common best practice to avoid attaching your data to a proprietary system or application. At a recent professional conference, Eliot Kimber humorously compared some highly proprietary content management systems to roach hotels: “content checks in but it doesn’t check out.”
At the simplest level, an enterprise component content management system architecture must provide rich methods to extract data. While this is commonly available in one-off operations, it’s important for enterprise systems to consider orders of magnitude when it comes to the execution of any single task. In other words, extracting a single asset or collection of assets from your system is only the beginning. Enterprise applications often need to import or extract data based on a number of factors, including business process, queries based on traceability or auditing, or global transformations to data entering or exiting an enterprise system. Doing it once is part of the answer, but doing it at scale is often required by customers who need automation during multiple steps of the content life cycle.
How is this done? It depends on the system, but a common best practice is to provide import/export features for various data formats, or even .zip archives representing a data dump of variable scope. It’s even better when these transactions can be managed by REST-based calls which may be invoked programmatically. Better still, rich REST-based APIs should provide a headless and efficient means of extracting all assets, including every version and all metadata for each asset, without introducing proprietary structures that interfere with the usefulness of the extract. These mechanisms do very little good if the customer’s original data structure is changed or rendered useless, for example when references use a proprietary model instead of a standards-based approach like a URI pattern. Yes, this still happens.
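As a sketch of the kind of headless, paginated extraction described above, a client might iterate every asset like this. The page structure and the `fetch` contract are assumptions for illustration, not a real product API:

```python
from typing import Callable, Iterator

def iter_assets(fetch: Callable[[int], dict], page_size: int = 100) -> Iterator[dict]:
    """Yield every asset record (each version, with its metadata) from a
    paginated REST listing. `fetch(offset)` is assumed to return a dict of
    the form {"items": [...], "total": N}; in a real client it would wrap
    an HTTP GET such as GET /assets?offset=N&limit=100 (hypothetical).
    """
    offset = 0
    while True:
        page = fetch(offset)
        yield from page["items"]
        offset += page_size
        if offset >= page["total"]:
            break

# Stubbed fetch standing in for the HTTP layer, so the sketch is runnable:
records = [{"id": i, "version": 1, "metadata": {}} for i in range(250)]
fetch = lambda offset: {"items": records[offset:offset + 100], "total": len(records)}
extracted = list(iter_assets(fetch))
```

Because every version and its metadata come back as plain records in a standard serialization, the extract remains useful outside the originating system, which is the point of the best practice above.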
Every elegant design should seek to simplify the most common tasks for end users, but it should also consider the administrative needs required to properly care for and feed the system. It’s the rare design that also accounts for unforeseen use cases and exposes additional layers of tooling to simplify configuration without high-cost services. Every system architecture should also consider its own end of life and the necessity to interact with initially incompatible systems in the enterprise ecosystem.
How current is your understanding of the solution architecture?
From time to time, technologists conduct due diligence and feasibility analysis to help them decide on acquiring new or replacement products or services. More often than not, such research is based on a few quick Internet searches. To remain relevant, the best technologists read as much as they can every day. They also go beyond the research to get hands-on experience with those same products and services on a periodic basis, to better understand exactly how the offerings have changed and to validate or invalidate their understanding. Sometimes the technology has changed significantly and in very good ways. At other times, the review is nothing more than a regurgitation of fluffy buzzwords used as clickbait, and the technology has actually stagnated. How will you know the difference? For example, one vendor might state “A robust API for integration with any other system,” which sounds pretty good. But what if you learned that the vendor only offered their API technology in an older standard called CORBA, widely considered a dead technology since 2004? Understanding how to ask the right questions is crucial to making good decisions.
Change is abundant, increasing in frequency, and far-reaching in scope. Technologies come and go, and yet it’s common for many businesses not to see a return on technology investment for 3 to 5 years. Therefore, the solution architecture must be robust enough to withstand and embrace the inevitable changes, which may be impossible to predict in every case. If the system is to remain relevant and deliver its value over the system’s life cycle, this additional layer of research is invaluable. Almost every vendor is willing to give a demo or build a snazzy web site, but how many of them will stand up a live proof-of-concept solution and let you test it against your actual requirements, rather than the filtered language that appears in an RFP or its response? And how much of your RFP is devoted to ensuring that reasonable architectural requirements are identified?
Open means easy integration
Easy integration starts with proven architectural frameworks for web-based applications. Those frameworks are constantly evolving, so any solid architecture will build out further tooling, back-end improvements, 3rd party technology partnerships, and flexibility for the continuously evolving front-end frameworks as well.
Let’s look at another example of easy integration. Many organizations manage assets in proprietary formats and aren’t ready to make the full transition to XML across the enterprise. That’s why the Quark Publishing Platform also supports managing InDesign documents and components (though we don’t provide any automation of InDesign documents). Otherwise, InDesign is treated as a first-class application and content type for those that need it. Similarly, MS Office documents have robust support, and some, such as Excel, PowerPoint, and Visio, can be a source of reusable components in XML. Of course, Platform also supports reusable components for QuarkXPress projects, Quark XTensions, and QuarkXPress Server publishing channels to assemble reusable pipelines for omni-channel publishing and delivery to HTML5, ePub, App Studio, PDF, 3rd party ECMs like FileNet and SharePoint, and more.
Why proprietary can be good
Quark owns all of the major technology in our enterprise system architecture that provides the business value for content automation. One strength of having a broad solution stack is that we can manage and synchronously release updates and enhancements according to our own schedule and prioritization, without having to wait for a 3rd party to decide whether they agree, how they will prioritize, or when they will deliver.
Of course, we still integrate several components into our system that do rely on 3rd parties. It makes perfect sense in many cases, so we carefully evaluate and select those open-source and proprietary partners who work with us to provide the best value for our customers at a reasonable cost. One significant driving factor is the underlying technical architecture and how responsive these 3rd party vendors can be to us. Just like our customers expect high quality and tight turnaround for fixes and features, we expect the same of our partners and appreciate when we have a strong rapport based on results.
Even proprietary technologies can and must play more nicely together. The underlying Quark technical infrastructure has recently captured attention for supporting other proprietary formats.
A key measure for proving an architecture starts with asking the right questions. An architecture is not truly open if it is dominated by proprietary software that simply imports/exports formats from one product to another. We must dig deeper and examine the underlying technical landscape.
Here are some questions worth asking:
- If key proprietary components are removed or replaced, does the architecture still hold together?
- How well can the architecture be extended by standard web-based technologies and modern development frameworks?
- If your solution is focused on content automation for the creation, management, publishing, delivery, and analytics of business-critical content, how easy is it to mix and match authoring tools for the content?
- How many different types of content can truly be managed as components which can be assembled for publishing, reviewed and approved, and ultimately delivered to various formats using reusable channels?
- How easily can you replace the publishing engine used to render various output formats?
- At the database layer, how many different databases are supported?
  - Do your requirements force you into a relational database model?
  - Do your requirements include supporting a native XML database, and if so, why?
- How well can your data tier scale horizontally and vertically for increases in assets and transactions by orders of magnitude?
  - How do you know?
  - Do you have benchmarks or test cases which help add quantitative analysis to the discussion?
- How responsive is a vendor to your needs as a customer and how well can that vendor address end-to-end enhancements at the velocity of business?
- Does the vendor have more than one or two reference customers in production for over a year who can back up their claims?