Integrating Your Higher Education Marketing Systems: 4 Common Pitfalls

Integrating Student Information Systems with Inquiry Management Systems

I’ve been on both sides of the fence when it comes to developing systems: both doing the work myself and, as a certified Project Management Professional (PMP), managing some great people who have been on my teams.

Having lived and labored in both camps, I’ve unfortunately seen several projects run into issues.  To be more precise, and for the purposes of this post, I’m speaking specifically about projects that involve the integration of two different systems and some of the common pitfalls that occur during these projects. 

We’re lucky to live in an age where all sorts of systems integrate with one another – even Twitter and a cat feeder.

Currently, my role focuses on the integration of student inquiry management systems (IMS) with customer relationship management (CRM) systems or student information systems (SIS). 

Integrating systems means there is communication going on between two disparate systems.  This communication can flow in one direction (like an IMS sending data to an analytics server for reporting purposes), or both systems can generate and consume data created by the other (usually referred to as a bi-directional integration). 
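To make the two shapes concrete, here is a minimal sketch using plain Python dicts in place of real systems. Every name and field here is illustrative, not an actual IMS or SIS API:

```python
# One-directional vs. bi-directional integration, sketched with
# in-memory stand-ins for the real systems. All names are made up.

def push_to_analytics(ims_record: dict, analytics_store: list) -> None:
    """One-directional: the IMS produces data; analytics only consumes it."""
    analytics_store.append({
        "inquiry_id": ims_record["id"],
        "program": ims_record["program"],
    })

def sync_status(ims_record: dict, sis_record: dict) -> None:
    """Bi-directional: each system consumes data the other created."""
    sis_record["inquiry_source"] = ims_record["source"]  # IMS -> SIS
    ims_record["admit_status"] = sis_record["status"]    # SIS -> IMS

# A single inquiry flowing through both patterns:
store = []
ims = {"id": 101, "program": "MBA", "source": "web form"}
sis = {"status": "applied"}
push_to_analytics(ims, store)
sync_status(ims, sis)
```

The point of the sketch is the direction of the arrows: in the first function only one side changes, while in the second both records end up holding data that originated in the other system.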

My team and I have worked on both types of integrations in order for systems to exchange inquiry, analytics, and overall marketing and admissions data. And when integrating two systems there is only one thing for certain, “anything that can go wrong will go wrong.”

To help you avoid some of the most common issues, I’ve put together a list of four pitfalls that can easily be avoided.

Click Here to Learn More About GlassPanel: Inquiry Management Built for Higher Education


Pitfall #1: Not Properly Defining Scope

One of the biggest issues I see when working on integrations, and projects in general, is when the scope of what is being done is not adequately defined.

In short, we missed something that needed to be included during the scoping phase of the integration.   When we dig into the real reason why this happens, the cause usually falls into one of two buckets:

  • We didn’t do a good job of defining all of the details of the integration, or
  • We found out at some point that we didn’t fully understand some aspect of what was being built.

Getting ahead of these problems can be as simple as setting aside dedicated time for scoping on any project, no matter how simple or complex you deem the effort to be, and making sure a scope document is created to capture everything called out during the scoping phase.  

By providing yourself with adequate time to outline you’ll also give yourself enough time to fully understand what you’re building.


Pitfall #2: Not Sticking to the Plan



So you’ve put time on the calendar for scoping and got all of the relevant teams together to understand what was needed. You’ve planned everything out and the team has an understanding of what’s needed in the integration.  Execution started and the integration is nearly ready for testing, but some early testing has revealed that something just isn’t right with the integration. 

So what happened and where did you go wrong? 

The odds are that if you scoped properly and put the plan together correctly, the plan simply wasn’t followed, and one of the following things happened:

  • Misunderstood requirements: The requirement was there in the plan and it was spot on, but the people building it didn’t understand it and built something else. This is a real issue when building any software, not just integrations between two systems, and it’s the reason why having a tighter feedback loop is important.  The team building the integration might not understand the business case for what is being asked for, so something else gets built. In other cases, members of the team may be confused but afraid to ask for clarification. It can happen to the smartest people – like NASA.  In September 1999, the Mars Climate Orbiter was lost because the software built to calculate the insertion into orbit used US customary units instead of the metric units specified in the requirements (https://en.wikipedia.org/wiki/Mars_Climate_Orbiter#Cause_of_failure)

 

  • Failed execution: This failure is simply about building the integration and having it not work as expected. This can happen when assumptions are made at the business-logic level or even at the programming-language level.  The easiest way to address this is to test, test, and test again until the work is in line with what was planned.  This is covered further under Pitfall #4: No Test Plan.

 

  • It’s a feature!: A feature or functionality was added to the integration that was not part of the original scope or the approved plan. Maybe someone wanted to have a validation that checks what zip code the student prospect came from or a call out to verify that the email address for the student prospect record is valid before being delivered to the SIS through the integration.  Whatever the reason, this is what’s frequently referred to as “scope creep”. Changes to scope can certainly happen and be accommodated, but if the execution of the work is in progress against an approved plan, these changes need to be carefully considered and the impacts of bringing them into the plan need to be assessed.  Even a small change that looks simple can add complexity, time, and cost to an integration.   If you consider how that might impact something like an upcoming term start for your institution, a delay of a week could potentially impact your enrollment goals.
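The NASA failure mentioned above came down to a unit conversion that was never applied between two systems exchanging data. As a sketch of the kind of guard that catches this class of mistake at the boundary (the values and function names here are illustrative, not from the actual orbiter software):

```python
# Guard against unit mismatches at an integration boundary by making
# the unit explicit in the interface. Names and values are illustrative.
LBF_S_TO_N_S = 4.44822  # pound-force seconds -> newton seconds

def impulse_in_newton_seconds(value: float, unit: str) -> float:
    """Normalize a thruster impulse to the metric units the spec requires."""
    if unit == "lbf*s":
        return value * LBF_S_TO_N_S
    if unit == "N*s":
        return value
    raise ValueError(f"unknown unit: {unit}")
```

The design choice worth copying is that the receiving side refuses data whose unit it doesn’t recognize, instead of silently assuming one.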


 

Pitfall #3: Missing Technical Details

Even for integrations that look very simple from a high-level business process standpoint, there can be a great deal of technical details that need to be defined in order for an integration that carries student marketing and admissions data between systems to work properly.

If this level of detail is missing from technical design documents, it can mean the difference between success and failure, so make sure that your teams are keeping good technical design documents in addition to the business process documents that outline how the integration works from a high level.
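As a sketch of the level of detail worth pinning down, here is a hypothetical field mapping between an IMS payload and an SIS – the kind of table a technical design document should capture explicitly (every field name below is made up):

```python
# A field mapping of the sort a technical design document should record:
# source field, destination field, and whether the field is required.
FIELD_MAP = {
    # ims_field:    (sis_field,            required)
    "first_name":   ("FirstName",          True),
    "last_name":    ("LastName",           True),
    "email":        ("EmailAddress",       True),
    "program_code": ("ProgramOfInterest",  False),
}

def translate(ims_record: dict) -> dict:
    """Apply the documented mapping; fail loudly on missing required fields."""
    out = {}
    for ims_field, (sis_field, required) in FIELD_MAP.items():
        if ims_field in ims_record:
            out[sis_field] = ims_record[ims_field]
        elif required:
            raise KeyError(f"missing required field: {ims_field}")
    return out
```

When the mapping lives in the design document and the code mirrors it one-for-one, a reviewer can verify the integration against the document line by line.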


Pitfall #4: No Test Plan

Sometimes it can feel like defining and building an integration with your student information system or CRM is a major undertaking.  However, one of the biggest parts of the build is in the testing of the integration. 

Just like you plan how and what to build, it’s equally important to plan how to test what was built. 

Creating an explicit test plan for your integration will help reduce uncertainty, ensure that what was planned and designed is what was actually built, and surface issues earlier in the development cycle, thereby reducing the cost associated with a defect. 

The test plan should include unit testing (automated tests created by developers to validate the system at a code-method level), integration tests (that check that data moving through the integration remains valid going from point A to point B), and end-to-end tests that validate the entire business process once the integration is in place.
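The first two levels can be sketched against a toy inquiry pipeline; end-to-end tests would exercise the whole business process and so don’t fit in a snippet. All function and field names here are illustrative:

```python
# Unit and integration tests for a toy inquiry pipeline. The names
# are illustrative; an end-to-end test would run the full business
# process (form submission through SIS record creation) instead.

def normalize_email(raw: str) -> str:
    """Unit under test: trim and lowercase an email address."""
    return raw.strip().lower()

def deliver(inquiry: dict) -> dict:
    """The 'integration': transform an IMS inquiry into an SIS record."""
    return {"EmailAddress": normalize_email(inquiry["email"])}

# Unit test: validate one method in isolation.
assert normalize_email("  Jane@School.EDU ") == "jane@school.edu"

# Integration test: data is still valid after moving from A to B.
assert deliver({"email": "Jane@School.EDU"}) == {"EmailAddress": "jane@school.edu"}
```

In practice these assertions would live in a test runner rather than inline, but the layering is the same: prove each piece first, then prove the hand-off between pieces.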

While this is by no means an exhaustive list of issues that can occur when integrating student prospect marketing and admissions systems, addressing these issues at the start can help cut down on major headaches later on.


Steven Hamilton
AUTHOR

I'm the Director of Product Management & Implementation Services at EducationDynamics. I enjoy learning about new technologies, writing about technology and spending time with my wife and two children. You may contact me directly at shamilton@educationdynamics.com
