Data is critical for providing organizations with the insights they need to deliver better customer experiences and business outcomes. But tapping into those insights securely, quickly, and cost-effectively isn't necessarily easy or cheap.
Data continues to grow exponentially, making data collection, processing, and storage complex and costly. Ever-changing compliance mandates require strict security and access protocols and processes. And data only delivers value if the right tools are in place to extract and visualize its insights, monitor business performance, and support decision making.
For one company, the answer was an AWS cloud-based solution architected by Opti9. It not only enabled the organization to overcome the challenges of data lifecycle management and maximize the value of its data; it also enables the company to provide its own customers with technology solutions that deliver insights faster and more easily.
From smart room integrations to digital displays, Sonifi delivers technology solutions to enhance the customer experience across a variety of industries. For the healthcare market, those solutions include interactive, in-room TV systems that provide educational and entertainment-based content.
To enable its own customers — children's hospitals, cancer centers, long-term care facilities, and others — to optimize the programming provided and deliver more personalized content, Sonifi collects data such as event logs. Its customers can use this information to gain valuable insights into consumer behavior and preferences and enhance their service offerings.
The problem was that content and video interactions generate huge amounts of data. Sonifi had approximately 5,000 headends (cable television control systems) deployed, pushing the equivalent of 200 million individual records and 35 GB of data daily to its corporate systems. The data was transmitted at varying intervals based on type and importance, and in a wide variety of formats.
Sonifi was using a legacy distributed data model to collect, store and process the data, but it was inefficient and suffered from latency and other issues. With the massive growth of its data, the company needed a more scalable, durable, secure, resilient, and cost-effective solution.
The Customer Requirements
Specifically, Sonifi wanted a data lake that could ingest 35 to 60 GB of data daily, maintain 400 days of data, and properly manage the data lifecycle. The overall solution would also need to meet a variety of security requirements.
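The 400-day retention requirement maps naturally onto an S3 lifecycle policy, given that the final architecture (described below) stores the data lake in Amazon S3. The following is a minimal sketch of how such a rule might be expressed with boto3; the bucket name and helper function are hypothetical illustrations, not details from the project.

```python
import json

# Hypothetical bucket name; the actual data lake bucket is not named in the case study.
DATA_LAKE_BUCKET = "example-sonifi-data-lake"

def build_lifecycle_config(retention_days: int = 400) -> dict:
    """Build an S3 lifecycle configuration that expires objects after the
    given number of days, matching the 400-day retention requirement."""
    return {
        "Rules": [
            {
                "ID": f"expire-after-{retention_days}-days",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply the rule to every object in the bucket
                "Expiration": {"Days": retention_days},
            }
        ]
    }

config = build_lifecycle_config(400)
print(json.dumps(config, indent=2))

# Applying the policy would require boto3 and AWS credentials, e.g.:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket=DATA_LAKE_BUCKET, LifecycleConfiguration=config
# )
```

A rule like this lets S3 manage the data lifecycle automatically, with no scheduled cleanup jobs to build or maintain.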
In addition, the company wanted a cost-effective data warehouse. It had to be architected to allow Sonifi’s customers to securely access their own raw data and use their own analytics and visualization tools such as Looker, Tableau, and others. Meanwhile, Sonifi needed the ability to query any combination of the ingested data. Data pipelines would be required to automate the ingestion and movement of the data.
Because Sonifi was already using AWS to support one of its primary applications, the company wanted the new data management solution to use that platform.
As an AWS Advanced Consulting Partner with extensive experience in data lake and data warehouse architecture, Opti9 was well suited for the job.
The project entailed a number of challenges. There were many unknowns for the Opti9 team, as well as a variety of concerns that had to be addressed before it could create the optimal solution. This required the team to think holistically and to stand up a proof-of-concept environment within only three months to meet the customer's schedule requirements.
The Opti9 Approach
Fortunately, Opti9 employs well-defined standards to help organizations embrace AWS cloud technologies, and a team of talented, certified engineers with a proven track record of developing successful solution architecture.
After gathering information about Sonifi's requirements and needs, the Opti9 team conducted its Jumpstart™ process, which employs a set of resources to develop a secure, organized footprint in the AWS cloud. It's designed to enable customers to get up and running quickly in an AWS environment while following AWS best practices.
A proof-of-concept followed, allowing the team to evaluate different AWS services at all layers of the solution. Finally, a cost analysis was conducted to ensure the solution would be cost-effective.
The team explored various options, including the design of several extract, transform, and load (ETL) processes using AWS Glue, a powerful managed ETL service. However, analysis determined that approach was too expensive for what needed to be achieved, leading the team to create a more cost-effective solution.
Briefly, the solution ingests 35–40 GB of semi-structured CSV data in batches every 15 minutes into an Amazon S3 bucket that serves as the data lake. Today, that data comes from a DRPS system hosted in Sonifi's corporate data center.
The solution also uses AWS Lambda, a serverless compute service: code runs only when needed and scales automatically, and users pay only for the compute time consumed. When a CSV file arrives in the S3 bucket, a Lambda function is invoked, triggering a data pipeline that automatically moves the data into an Amazon Redshift cluster.
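The S3-to-Lambda-to-Redshift step described above can be sketched as a small Lambda handler. This is an illustrative assumption of how such a function might look, not the actual project code: the table name, IAM role ARN, and cluster details are all hypothetical, and one common way to run the load (the Redshift Data API) is shown commented out.

```python
# Hedged sketch of an S3-triggered Lambda that loads a newly arrived
# CSV file into Redshift with a COPY command. All names are hypothetical.

def extract_s3_object(event: dict) -> tuple[str, str]:
    """Pull the bucket and key of the new CSV file out of the
    S3 event that invoked the function."""
    record = event["Records"][0]
    return record["s3"]["bucket"]["name"], record["s3"]["object"]["key"]

def build_copy_statement(bucket: str, key: str,
                         table: str = "events_raw",
                         iam_role: str = "arn:aws:iam::123456789012:role/redshift-copy") -> str:
    """Build a Redshift COPY statement that ingests the CSV file."""
    return (
        f"COPY {table} FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' FORMAT AS CSV IGNOREHEADER 1;"
    )

def handler(event, context):
    bucket, key = extract_s3_object(event)
    sql = build_copy_statement(bucket, key)
    # In a deployed function, the statement could be executed against the
    # cluster, e.g. via the Redshift Data API:
    # import boto3
    # boto3.client("redshift-data").execute_statement(
    #     ClusterIdentifier="analytics-cluster", Database="analytics",
    #     DbUser="loader", Sql=sql)
    return sql

# Simulated S3 event for local testing
sample_event = {"Records": [{"s3": {"bucket": {"name": "example-data-lake"},
                                    "object": {"key": "batches/2024/batch-0001.csv"}}}]}
print(handler(sample_event, None))
```

Because the function fires per object and Lambda scales automatically, the 15-minute batch arrivals need no dedicated servers or schedulers.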
Redshift serves as the data warehouse layer, where data is ready to be analyzed. Query results are delivered at high speeds using standard SQL, and can be saved back to the S3 data lake using open formats.
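Saving query results back to the S3 data lake in an open format is what Redshift's UNLOAD command does. The sketch below builds such a statement around a sample aggregation query; the query, S3 prefix, and IAM role are hypothetical examples, not taken from the project.

```python
def build_unload_statement(query: str, s3_prefix: str,
                           iam_role: str = "arn:aws:iam::123456789012:role/redshift-unload") -> str:
    """Wrap a SELECT in a Redshift UNLOAD statement so the results are
    written back to the S3 data lake in an open columnar format (Parquet)."""
    escaped = query.replace("'", "''")  # UNLOAD takes the query as a quoted string
    return (
        f"UNLOAD ('{escaped}') TO '{s3_prefix}' "
        f"IAM_ROLE '{iam_role}' FORMAT AS PARQUET;"
    )

# Hypothetical aggregation: viewing counts per device, exported to the lake
stmt = build_unload_statement(
    "SELECT device_id, COUNT(*) AS views FROM events_raw GROUP BY device_id",
    "s3://example-data-lake/exports/views_by_device/",
)
print(stmt)
```

Exporting to Parquet keeps the results queryable by other tools without tying them to the warehouse.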
At the same time, Oracle online transaction processing (OLTP) databases are continuously replicated in real time into online analytical processing (OLAP) Amazon Redshift tables using AWS Database Migration Service.
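In AWS Database Migration Service, which tables get replicated is controlled by JSON table-mapping rules, and ongoing replication corresponds to a task with a full-load-plus-CDC migration type. The sketch below shows, under assumed names (the "OPS" schema and all ARNs are hypothetical), roughly what that configuration might look like.

```python
import json

def build_table_mappings(schema: str) -> str:
    """Build DMS table-mapping rules that select every table in the
    given source schema for replication."""
    mapping = {
        "rules": [
            {
                "rule-type": "selection",
                "rule-id": "1",
                "rule-name": "include-schema",
                "object-locator": {"schema-name": schema, "table-name": "%"},
                "rule-action": "include",
            }
        ]
    }
    return json.dumps(mapping)

mappings = build_table_mappings("OPS")  # hypothetical Oracle schema name
print(mappings)

# Creating the ongoing-replication task would look roughly like:
# import boto3
# boto3.client("dms").create_replication_task(
#     ReplicationTaskIdentifier="oracle-to-redshift",
#     SourceEndpointArn=...,       # Oracle source endpoint
#     TargetEndpointArn=...,       # Redshift target endpoint
#     ReplicationInstanceArn=...,
#     MigrationType="full-load-and-cdc",  # initial load plus continuous change capture
#     TableMappings=mappings)
```

The "full-load-and-cdc" type is what gives the initial copy followed by the constant real-time replication the article describes.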
AWS Data Pipeline reliably processes and moves data between the compute and storage services, as well as on-premises data sources, at specified intervals. It makes it easy to create complex data processing workloads that are fault-tolerant, repeatable, and highly available. Amazon CloudWatch monitors the data pipelines in real time and collects and tracks metrics.
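The CloudWatch monitoring mentioned above often includes custom metrics alongside the service's built-in ones. As an illustration (the metric name, namespace, and pipeline name are hypothetical, not from the project), each pipeline run might publish how many records it loaded:

```python
from datetime import datetime, timezone

def build_metric(pipeline_name: str, records_loaded: int) -> dict:
    """Build a CloudWatch custom-metric datum recording how many
    records a pipeline run loaded. Names are hypothetical."""
    return {
        "MetricName": "RecordsLoaded",
        "Dimensions": [{"Name": "Pipeline", "Value": pipeline_name}],
        "Timestamp": datetime.now(timezone.utc),
        "Value": float(records_loaded),
        "Unit": "Count",
    }

datum = build_metric("csv-to-redshift", 250_000)

# Publishing the datum would use boto3:
# import boto3
# boto3.client("cloudwatch").put_metric_data(
#     Namespace="DataLake/Ingestion", MetricData=[datum])
```

A metric like this makes it straightforward to alarm when a batch loads far fewer records than expected.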
Finally, the Redshift cluster is synced up with third-party visualization tools such as Tableau and Looker.
The project successfully met Sonifi’s goals and technical requirements, yielding a cost-effective solution for gathering and processing data from hundreds of sources and in numerous formats.
Business analysts, data scientists, and decision-makers at Sonifi can access the data through business intelligence (BI) tools, SQL clients, and other analytics applications to generate quick data insights. Sonifi’s customers also have secure access to their raw data and can use their own BI tools to extract essential information.
Opti9 is continuing to work with Sonifi to extend the solution to meet the needs of its customers in other industries.
What It All Means for You
While the ability to quickly and cost-effectively gather, process, and analyze data is highly useful for just about every business, each one is unique. Every organization has its own specific goals, needs, and technical requirements. As Opti9's work with Sonifi illustrates, the right combination of experience and expertise can overcome just about any challenge and generate an optimal solution.
If you’re interested in learning what Opti9 can do for you, let us know. Our solution engineers will be happy to discuss your needs and how our processes, which are always customized to your specific requirements, can help meet them.