Modernizing Data Architecture for Real-Time Decisioning with SAS ID for a Financial Services Leader

This blog post is the first in a three-part series exploring the major technical challenges (Data, Offer Management, and Omnichannel Activation) encountered, and the solutions used, when migrating a major financial services client’s marketing campaigns from SAS Real-Time Decision Manager (RTDM) to SAS Intelligent Decisioning (ID). While some of the content is vendor- or industry-specific, we hope you find it insightful and relevant to your daily work.

Services Provided: MarTech Modernization, Data Engineering, Testing & QA, Real-Time Marketing, Real-Time Decisioning Strategy, Cloud-Hosted Migration, SAS Intelligent Decisioning Implementation, SAS Real-Time Decision Manager

Key Topics Covered

    • Migrating from SAS RTDM to SAS Intelligent Decisioning
    • Designing cloud data architecture for real-time decisioning
    • Validating decision logic without exposing customer PII
    • Optimizing legacy decision rules for modern data platforms

The Challenge

The client, a leading financial services provider, needed to upgrade the marketing orchestrator, the “brain” behind their real-time marketing campaigns, transitioning from their legacy SAS Real-Time Decision Manager (RTDM) instance to the modern, cloud-hosted SAS Intelligent Decisioning (ID) solution using a multi-phased migration plan. Real-time marketing generated over one hundred million dollars in incremental revenue, making the migration a key strategic decision for our client and making it imperative to move forward with a trusted professional services partner.

In any major marketing technology upgrade, the biggest hurdle is rarely the software itself; it is the underlying data, from where it is stored to how it is structured. The client’s customer data lived in a secure, on-premise IBM DB2 database, and with over 200 complex marketing campaigns leveraging this data to migrate to SAS ID, our team faced three major challenges:

    1. The “Chatty Connection” Delay:
      Connecting a new AWS cloud-hosted instance of SAS ID directly to the existing on-premise database using standard methods (e.g., a JDBC connection) was too slow for bulk data transfer, especially since the database still had to serve the existing SAS RTDM campaigns. Standard connections forced the two systems to constantly ping each other back and forth over the network, creating a bottleneck. The resulting delay threatened to exceed the overnight processing windows used to calculate customers’ offers, which was not acceptable.
    2. Testing “In the Dark” (Strict Privacy):
      Because the data included customer PII (Personally Identifiable Information) and highly sensitive financial information, copying production data into safe testing areas (often called “lower environments”) was not permitted. This meant the Munvo team would have to write and test new code without ever seeing the actual data, posing a significant risk to the marketing campaign migration.
    3. The (Programming) Language Barrier:
      The old system’s marketing rules were written in legacy, platform-specific languages (SAS DS2 and Groovy). The new cloud environment required modern, distributed technologies (SparkSQL and Python). We couldn’t just copy and paste; every rule had to be translated and completely re-engineered.

The Munvo Solution

To overcome these roadblocks, our client and Munvo engineered a modern data pipeline that prioritized security, speed, and architectural flexibility.

  1. Building a Replication Loop to fix the “Chatty Connection” delay:
    Instead of forcing the AWS cloud instance to constantly ping and ask the on-premise database for information row-by-row or customer-by-customer, we built a bulk-transfer system.
    • Ingest: We securely copied the necessary daily data into cloud storage folders (AWS S3).
    • Process: We used cloud computing engines (AWS Glue) to process the data all at once, taking the heavy lifting off the client’s internal servers.
    • Consolidate: Finally, we saved the results in a highly compressed, efficient file format (Parquet) and sent it back to the main database.
      This replication method completely bypassed the existing network bottleneck while ensuring no additional delays were introduced into the overall process.
  2. Developing a creative testing strategy without “Test Data”:
    For most campaigns, the client required a high match rate (99%+) between Production and non-Production results, both in the number of customers selected and in exactly which customers they were, before signing off on a migrated campaign. Since mock or “fake” test data could not be used (it would not behave like the data in Production), Munvo built a unique validation process: scripts compared the results of our new cloud code directly against the old system’s historical records. For live campaigns where history wasn’t available, we built a tool to digitally “call” the old system’s rulebook (i.e., its API) in real time.

    If our new system and the old system returned the exact same customer offer, we knew the logic was accurate, without ever exposing private customer data. This provided an apples-to-apples testing approach, using the same API-driven process as with real customers in real time, further cementing the test results.
  3. The “Triple Jump” code optimization for the (Programming) Language Barrier:
    We didn’t just act as translators; we acted as optimization experts. We 1) took the old legacy code, 2) extracted the core business logic, and 3) rewrote it as platform-agnostic SQL, which we then optimized specifically for the AWS cloud (SparkSQL). This triple jump ensured that scanning billions of customer records would take minutes rather than hours.
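To make the shape of the replication loop in step 1 concrete, here is a minimal Python sketch using in-memory stand-ins. The real pipeline staged files in S3, ran AWS Glue jobs, and wrote Parquet back to DB2; the function names, fields, and segmentation rule below are illustrative assumptions, not the client’s actual implementation.

```python
import json

def ingest(daily_extract):
    """Stage the nightly extract in bulk (in production: a secure copy to an S3 prefix)."""
    return list(daily_extract)

def process(rows):
    """Transform the whole batch at once (in production: an AWS Glue job),
    shifting the heavy lifting off the client's on-premise servers."""
    return [
        {**row, "segment": "high_value" if row["balance"] >= 10_000 else "standard"}
        for row in rows
    ]

def consolidate(rows):
    """Serialize one compact result set (in production: a compressed Parquet
    file shipped back to the main database)."""
    return json.dumps(rows)

# Hypothetical nightly extract.
extract = [
    {"customer_id": "C1", "balance": 15_000},
    {"customer_id": "C2", "balance": 2_500},
]
output = consolidate(process(ingest(extract)))
```

Because each stage handles the entire batch, the cloud and on-premise systems exchange one bulk artifact per night instead of making row-by-row network calls.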
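The sign-off check in step 2 reduces to comparing per-customer offers across the two systems. The sketch below is a hedged illustration: the offer dictionaries stand in for what would actually come from RTDM’s historical records (or its API) on one side and the new SAS ID logic on the other.

```python
def match_rate(legacy_offers, migrated_offers):
    """Fraction of customers receiving the identical offer in both systems.
    A customer present in only one system counts as a mismatch."""
    all_ids = set(legacy_offers) | set(migrated_offers)
    if not all_ids:
        return 1.0
    matches = sum(
        1 for cid in all_ids
        if legacy_offers.get(cid) == migrated_offers.get(cid)
    )
    return matches / len(all_ids)

# Illustrative results keyed by customer ID (not real data).
legacy = {"C1": "premium_card", "C2": "cashback_card", "C3": "no_offer"}
migrated = {"C1": "premium_card", "C2": "cashback_card", "C3": "no_offer"}

rate = match_rate(legacy, migrated)
```

A migrated campaign would pass sign-off only when the rate met the 99%+ threshold; any mismatched customer IDs could then be investigated individually, all without ever inspecting the underlying PII.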
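To illustrate the “triple jump” in step 3, here is a hypothetical decision rule expressed as platform-agnostic SQL and executed with Python’s built-in sqlite3 module; the same statement would run largely unchanged under SparkSQL. The table, column, threshold, and offer names are invented for this sketch and are not the client’s actual logic.

```python
import sqlite3

# Step 3 of the triple jump: the business rule rewritten as portable SQL.
RULE_SQL = """
SELECT customer_id,
       CASE
         WHEN balance >= 10000 AND tenure_months >= 12 THEN 'premium_card'
         WHEN balance >= 1000 THEN 'cashback_card'
         ELSE 'no_offer'
       END AS offer
FROM customers
"""

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (customer_id TEXT, balance REAL, tenure_months INTEGER)"
)
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [("C1", 15000, 24), ("C2", 2500, 6), ("C3", 400, 3)],
)

offers = dict(conn.execute(RULE_SQL).fetchall())
```

Keeping the rule in standard SQL is what makes the logic portable: it can be unit-tested locally like this, then deployed to a distributed SparkSQL engine for the billion-row production scans.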

Key Benefits

Eliminated Network Delays: By moving the heavy data processing to the cloud (AWS) rather than relying on direct, slow connections to the on-premise database, the client achieved significantly faster processing times and lowered their operational and infrastructure costs.

Zero-Risk Deployment: Despite working under strict privacy rules with no test data, Munvo’s creative validation tools resulted in a flawless launch. We successfully reduced technical defects to zero by the final month of the project, ensuring the new system went live with 100% accuracy.

Future-Proofed Marketing Logic: Migrating away from older, proprietary code into standardized SQL meant that the client’s campaign logic is now flexible. It can easily be understood by modern developers and integrated with other new data tools in the future.

Empowered the Internal Team: By removing the strain on their on-premise servers and cleaning up the underlying code, the client’s marketing operations team can now focus on building new, personalized campaigns rather than troubleshooting slow data queries.

Key Takeaway

Modernizing real-time decisioning requires more than a platform upgrade. By redesigning the data pipeline, validating campaign logic safely, and translating legacy decision code, organizations can successfully migrate to SAS Intelligent Decisioning while improving performance and scalability.

Thank you for reading this first blog post on real-time decisioning’s Data challenges. Next in the series, we will explore Offer Management challenges for real-time campaigns. Stay tuned!

Is Your Real-Time Decisioning Architecture Ready for Modern Marketing?

Contact Munvo to assess your current approach and identify practical next steps.

Sales Inquiries + 1 (514) 223 3648
General Inquiries + 1 (514) 392 9822
sales@munvo.com

© 2026 Munvo is a trademark of Munvo Solutions Inc.
