Credit unions have long been valued for their customer-centric approach and personalized service. They strive to create a seamless member experience by understanding their members' needs and preferences. In today's digital age, however, that understanding requires much more than personal interactions. It calls for collecting and analyzing data to gain insight into members' behavior and preferences.
How do you begin, and what best practices should you follow? That's where this Member 360 reference data model comes into play. The model is designed to provide a comprehensive view of members by combining all available data points into one unified view.
This post assumes you already have a data architecture strategy in place or in planning. If not, refer to an earlier article in this series on a reference architecture.
Or see a higher-level overview of the reasons to focus on Member 360 data analytics.
The goals are to better understand member behavior and preferences, improve the quality of service, and create more personalized offerings.
In this post, we'll take a closer look at how to design a data model in your data warehouse that is suited for member analytics and reporting. Then, I will show some examples of designs for specific subject areas.
At Datateer, we follow the Simpler Analytics framework for delivering reporting and analytics. Let's lay down some definitions so that we are using the same terminology.
Our experience at Datateer has shown following the Kimball design methodology to be superior to other approaches for data analytics. Star schemas, with fact and associated dimensions, have benefits over other methods we have evaluated and implemented.
Although not new, and not without detractors, this design method has crossed the chasm and proven itself in modern data analytics. It is acceptably fast even with large data volumes, and it is the easiest approach for analysts, data scientists, and business users to understand.
To follow this methodology, design fact tables that reference dimension tables.
Dimension tables should be reused across fact tables. Using the member dimension as an example: any dimensional analysis involving members should refer to the same "dim_members" table.
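To make the pattern concrete, here is a minimal sketch in SQL. All table and column names are illustrative, not taken from any particular core system:

```sql
-- A conformed member dimension, referenced by every fact table
-- that needs member context.
create table dim_members (
    member_key   integer primary key,  -- surrogate key owned by the warehouse
    member_id    varchar not null,     -- natural key from the core system
    full_name    varchar,
    join_date    date,
    main_branch  varchar
);

create table fct_loan_balances (
    balance_date  date not null,
    member_key    integer not null references dim_members (member_key),
    loan_balance  numeric(18, 2)
);

-- Because the dimension is conformed, any analysis "by member" is a
-- simple join, and every team gets the same definition of a member.
select m.main_branch, sum(f.loan_balance) as total_balance
from fct_loan_balances f
join dim_members m on m.member_key = f.member_key
group by m.main_branch;
```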
In organizations large and small, consistency of terminology and in definitions of key entities is critical. If multiple departments have slightly different definitions of what “member” technically means, this principle of conformed dimensions will force that conversation to the surface.
Successful data analytics efforts force the issue and reach a consensus on what "member" means in the organization. Organizations that allow multiple definitions of "member" to persist in centralized analytics are less successful.
Many-to-many relationships provide proper normalization in a transactional application database, but they have no place in an analytical data model. They overcomplicate the model, making it difficult to understand and maintain, and they hurt performance by requiring more joins than necessary.
Breaking down an analytical data model into subject areas aligned with business objectives brings focus and better understanding.
Each subject area tracks measurements specific to the analyses that it needs to support. These are independent fact tables.
Many dimensions, however, can be shared across subject areas. The best example of this is the member dimension, which relates to all these subject areas in a Member 360 effort.
These examples are based on Datateer's experience in customer/member 360 analytics, and on data models that power Datateer's Instant Analytics. Even though your situation is unique, my goal is to provide a solid starting point you can apply to your own analytical data models.
Each of these examples incorporates the member dimension, allowing analysis by member segment.
Credit scores are an important input to underwriting and risk analyses. In the example below, we track credit scores in a fact table named fct_credit_scores. The dimensions we apply are date, member, location, and credit agency.
This allows answers to different types of analysis questions around credit scores and history, such as how average scores trend over time, how they vary by location, and how the agencies' scores compare for the same members.
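A minimal sketch of this star schema in SQL, assuming the four dimension tables already exist (all names are illustrative):

```sql
-- Fact table at the grain of one credit score per member, date,
-- location, and reporting agency.
create table fct_credit_scores (
    date_key           integer not null,  -- -> dim_dates
    member_key         integer not null,  -- -> dim_members
    location_key       integer not null,  -- -> dim_locations
    credit_agency_key  integer not null,  -- -> dim_credit_agencies
    credit_score       integer
);

-- Example question: how has the average score trended, by agency?
select d.calendar_month, a.agency_name, avg(f.credit_score) as avg_score
from fct_credit_scores f
join dim_dates d           on d.date_key = f.date_key
join dim_credit_agencies a on a.credit_agency_key = f.credit_agency_key
group by d.calendar_month, a.agency_name
order by d.calendar_month;
```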
Recording transaction history allows analyzing member behavior by specific segment. Depending on the granularity of the questions that are part of the analysis, this fact table can be a daily summary, or it can track individual transactions.
The fact table in the middle measures transaction amounts and daily balances. The dimensions date, member, and account allow trending analysis by member, account, and segment.
I added a reference from the account dimension table to the member table for convenience in filtering. A slightly more advanced, and purer, pattern would be a member-account dimension that combines member and account.
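Here is a sketch of that schema at a daily-summary grain, including the convenience reference from account to member (names are illustrative):

```sql
-- Account dimension carries a member_key for convenient filtering.
create table dim_accounts (
    account_key   integer primary key,
    account_id    varchar not null,
    account_type  varchar,
    member_key    integer references dim_members (member_key)
);

-- Fact table at one-row-per-account-per-day grain; switch to one row
-- per transaction if the analyses need that level of detail.
create table fct_transactions_daily (
    date_key            integer not null,  -- -> dim_dates
    member_key          integer not null,  -- -> dim_members
    account_key         integer not null,  -- -> dim_accounts
    transaction_amount  numeric(18, 2),    -- total amount for the day
    daily_balance       numeric(18, 2)     -- end-of-day balance
);
```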
Questions that this star schema can answer include how balances trend over time, which accounts are most active, and how transaction behavior differs across member segments.
Net Promoter Score (NPS) is a common way to measure overall member satisfaction. With two dimensions, member and date, surrounding a fact table that measures scores, you get a simple star schema.
Even a star schema this simple allows you to answer a variety of questions, such as how NPS trends over time and whether satisfaction differs across member segments.
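As a sketch, trending satisfaction by branch becomes a single query; this assumes a hypothetical fct_nps_scores fact with date and member keys:

```sql
-- Average NPS per quarter and branch, from a two-dimension star schema.
select d.calendar_quarter,
       m.main_branch,
       avg(f.nps_score) as avg_nps
from fct_nps_scores f
join dim_dates d   on d.date_key = f.date_key
join dim_members m on m.member_key = f.member_key
group by d.calendar_quarter, m.main_branch
order by d.calendar_quarter;
```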
The examples above illustrate good analyses, but they are a bit naïve: members can belong to multiple segments! A simple but effective way to model this is to add columns to the member dimension representing the segments a member belongs to.
Although seemingly simple, this approach provides the same dimensional analysis capabilities as more advanced approaches, including grouping, sorting, and filtering. It is manageable up to about 10 segments and is a good place to start.
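A sketch of the column-per-segment approach; the segment names are hypothetical examples:

```sql
-- One boolean column per segment on the member dimension.
alter table dim_members add column is_youth_segment  boolean default false;
alter table dim_members add column is_small_business boolean default false;
alter table dim_members add column is_high_balance   boolean default false;

-- Grouping, sorting, and filtering then work like any other attribute.
select count(*) as member_count
from dim_members
where is_small_business and is_high_balance;
```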
For a more advanced pattern, you can use a bridge table to a separate member-segment dimension, a bridge from the member dimension, or a many-to-many join table to a segment dimension. Here is an example of a bridge table to a member-segment dimension table.
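A minimal sketch of one bridge-table variant, reusing the transaction fact from earlier; again, names are illustrative:

```sql
-- Segment dimension plus a bridge resolving the many-to-many
-- relationship between members and segments.
create table dim_segments (
    segment_key   integer primary key,
    segment_name  varchar not null
);

create table bridge_member_segments (
    member_key   integer not null references dim_members (member_key),
    segment_key  integer not null references dim_segments (segment_key)
);

-- Example: total daily balances for every member in one segment.
select m.member_id, sum(f.daily_balance) as total_balance
from fct_transactions_daily f
join dim_members m            on m.member_key = f.member_key
join bridge_member_segments b on b.member_key = m.member_key
join dim_segments s           on s.segment_key = b.segment_key
where s.segment_name = 'Small Business'
group by m.member_id;
```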
In the real world, a member’s attributes can change. By tracking a dimension’s history, the data model will continue to support historical or “lookback” analysis.
For example, the diagram below tracks the history of a member's main branch. The dimension then becomes a view or table that contains only the most recent row for each member from the history table.
Any analyses that need historical data about the member’s prior branch can use the member history table to access that historical data.
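One common way to implement this is a type 2 slowly changing dimension: each change creates a new row with an effective date range, and a view exposes only the current rows. A minimal sketch:

```sql
-- History table: one row per version of a member's attributes.
create table dim_members_history (
    member_key      integer,           -- new surrogate key per version
    member_id       varchar not null,  -- stable natural key
    main_branch     varchar,
    effective_from  date not null,
    effective_to    date               -- null marks the current row
);

-- The "current" dimension is just a view over the history.
create view dim_members_current as
select *
from dim_members_history
where effective_to is null;

-- Lookback analyses join the history table on the date range instead:
--   join dim_members_history h
--     on h.member_id = f.member_id
--    and f.transaction_date >= h.effective_from
--    and (h.effective_to is null or f.transaction_date < h.effective_to)
```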
Designing a flexible data model is an iterative process that lets you support KPIs and metrics specific to your credit union's unique needs.
Collecting and analyzing data is the basis of a credit union Member 360. Analyzing members' behaviors and needs is not new, but modern technology and techniques make it more achievable than ever before. By investing here, credit unions can gain insight into how to improve their products and services and deliver a better member experience.
In the midst of trying to justify digital transformation, cloud adoption, and modern analytics, powerful examples can help. Member 360 shows clear ROI and can be a great anchor in building a business case.
In this post, I’ll share some real-world examples of how credit unions have seen successful results from applying Member 360.
Navy Federal Credit Union recently launched a Member Strategy Office, combining survey and other primary source data, member behavior data, and user experience prototyping. The stated goal is to get into members’ heads for better strategic decision making. Underlying all of that is a member 360 data warehouse.
Not every credit union has the resources for such a large effort. But every credit union does have a lot of member data scattered across the organization in different silos.
Pulling this together opens up the ability to segment and understand behavior within each segment. Understanding member segments’ needs can inform you where to focus, just like NFCU uses its Member Strategy Office.
It can cost $700 in marketing spend to acquire a single new member. That figure does not count the hard costs of onboarding or the time spent with new members. And it applies to larger CUs; typically, the smaller the credit union, the higher the acquisition cost.
Lifetime value will be unique to your members, but consider $20,000 to be typical. Lifetime value is directly influenced by acquisition costs and the longevity of the member relationship.
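As a back-of-the-envelope sketch using the figures above (both are illustrative assumptions to replace with your own numbers):

```sql
-- Rough member economics: acquisition cost as a share of lifetime value.
select
    20000.00 as lifetime_value,                  -- typical LTV cited above
    700.00   as acquisition_cost,                -- marketing cost per new member
    20000.00 - 700.00 as net_lifetime_value,     -- 19,300.00
    round(700.00 / 20000.00 * 100, 1) as acq_pct_of_ltv;  -- 3.5%
```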
Using data on member behavior, credit unions can predict which members are most likely to cancel services. Campaigns and loyalty programs aimed at specific segments increase retention.
Marketing research consistently shows that selling to existing members is much less expensive and much more profitable than acquiring new ones.
The other side of the coin from simple retention is increased wallet share. The same techniques to avoid churn can be used to increase the number of services or increase usage of core services.
Cross selling has been around for a long time in credit unions. But what about more targeted cross selling? Narrow segmentation and targeting become possible with a centralized Member 360. By understanding behavioral signals, basic demographics, and product usage, you can define narrow segments appropriate for targeted offers.
After the bank runs and failures of 2023, consumer trust in banks is low. Credit unions are enjoying a boost as new members seek out the trustworthiness and personalized service of credit unions.
But it won't last. And it may not matter. Trustworthiness does not guarantee a loyal relationship with members. A decade-old report from Filene still rings true: "Consumers consistently identify credit unions as trusted financial providers, but that bond doesn't translate automatically into increased wallet share or membership growth. In transitional times like these, credit unions may survive by maintaining historical goodwill, but they can't win without a focus on delivering functional as well as emotional value to the consumer."
A solid Member 360 can highlight low points in the member journey, missing touchpoints, and members' communication preferences. As exciting as AI and advanced loyalty programs sound, it is easy to jump ahead to them instead of fixing these basics.
Member 360 can inform and enhance members’ interactions through digital channels and person-to-person communication.
Navy Federal Launches New Initiative to Continue Its Member-Focused Mission (cutimes.com)
How Credit Unions Can Accelerate Digital Transformation (cutimes.com)
https://www.cubroadcast.com/uploads/2/2/3/4/22347314/casestudyaffinitypluscumember360_20120503.pdf
https://cu-2.com/cu-member-acquisition/
https://www.cutimes.com/2017/03/13/why-credit-unions-need-to-focus-on-member-ltv/
https://www.medialogic.com/blog/financial-services-marketing/credit-union-member-loyalty/
https://smallbusinessbonfire.com/customer-retention/
https://www.cutimes.com/2010/03/16/filene-report-focuses-on-gaining-more-wallet-share/
How can you design a data analytics platform that follows best practices, yet remains flexible to your unique needs? This is daunting.
In this article, I’ll break down a reference architecture, with a special focus on the needs of credit unions. Here is how a reference architecture can help:
Designing a data analytics platform for credit unions can be a complex task, requiring a deep understanding of the organization's needs and priorities. A reference architecture helps by providing a solid foundation for the design and a standardized approach to data management, governance, and usage. This makes the platform easier to design, build, and maintain, and helps ensure it meets your specific needs.
By having a well-designed data analytics platform, credit unions can improve their business outcomes in several ways: generating higher ROI on marketing activities, improving member experience with personalized products and services, and increasing operational efficiency through automation. These benefits help credit unions remain competitive in the financial industry and provide value to members.
This reference architecture is derived from Datateer’s Managed Analytics service, which powers data analytics for all of our customers. It is the result of years of refinement, hardened in the messy real world.
In a later article, I will address applying the Simpler Analytics framework to your data analytics operations. That framework focuses on process and organization, and is complementary to the reference architecture described in this article.
Let’s dive in.
First, let’s visualize a mature architecture. But you should not attempt to build everything at once! Further down, I will break this down into an approachable roadmap.
Below is a diagram of a mature system, emphasizing the flow of data from raw data sources to audiences that consume analytics.
Here is a breakdown of each component, why it matters, and examples of vendors that provide products or tools for each.
The general outline is: data warehouse, data lake, ingestion, transformation, orchestration, governance, data quality, business intelligence, activation, data operations, and cloud infrastructure.
Before digging into each component, check out our Guide to Managed Analytics, explaining a type of service that manages your data analytics for you.
A data warehouse is a database. In spite of all the marketing messages about innovation (and there is a lot of genuine innovation happening), in the end it is a place to store data, with a SQL query interface on top.
The warehouse is the core of everything in your data analytics system. Leading vendors are Snowflake and Databricks; Google BigQuery, Amazon Redshift, and Microsoft Synapse are strong offerings tied to their respective cloud platforms.
A data lake is a place to dump data: a centralized location where it is easy to collect data from operational systems. The key challenge is finding the right balance between preprocessing data enough to be usable and keeping it faithful to how it arrived from the source.
A data lake is the bridge between ingestion and structured analytics. At Datateer, we use a simple setup of storage buckets in AWS or GCP. An alternative is to designate a segment of the data warehouse as the lake; Snowflake and Databricks are both pushing toward this pattern (in part because it increases your reliance on their products).
The lake handles what we call "preprocessing," which is basic cleansing and technical checks. But the lake should not apply any business logic: the data should remain structured and accessible as it arrived from the data source.
Ingestion, or data replication, is all the processes involved in extracting data from data sources like operational systems, SaaS APIs, application databases, etc. The goal is getting data from all areas of your business into a central location so that the data can be analyzed collectively.
The most important part of the ingestion component is deceptively simple. Modern data analytics follows an extract-load-transform pattern (ELT), not the older extract-transform-load pattern (ETL). By waiting until after data is extracted and loaded (EL) before performing transformation (T), you remove many problems that have long plagued data analytics efforts.
At Datateer, we have found it helpful to define a set number of extraction strategies or patterns. When we are dealing with a new data source, or trying to understand an existing one better, it helps to know which extraction pattern applies. Common examples are full refreshes of an entire table, incremental loads keyed on a last-modified timestamp, and change data capture (CDC) from database logs.
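As a sketch, the incremental pattern looks like this in plain SQL; the source table and watermark parameter are hypothetical, and the orchestrator is assumed to track the last successful run:

```sql
-- Incremental (high-water mark) extraction: pull only the rows that
-- changed since the previous run, assuming a reliable updated_at column.
select *
from source_system.loan_applications
where updated_at > :last_high_water_mark  -- bound by the orchestrator
order by updated_at;
-- After loading, persist max(updated_at) as the new high-water mark.
```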
In the past five years, specialized vendors have created many prebuilt extractors. Their cost and reliability mean custom-built extractors should be a rare case; at Datateer, we custom-build only about 15% of data extractors. Credit unions, however, run a number of niche systems like core systems and loan origination systems, so you may need to build 40-50% of your extractors.
We have had positive experiences relying on extractors from Matillion, Portable, and Fivetran. For custom-built extractors, we find success with Meltano and Airbyte.
Transformation refers to combining, aggregating, normalizing, and performing calculations. It’s all the code and logic to get data from its raw state into an analytical data model designed for reporting. The leading solutions in modern data analytics are Matillion and dbt Labs.
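In the ELT style, a transformation is just SQL run inside the warehouse; this sketch is roughly what a dbt model looks like, minus dbt's templating, with all source names assumed:

```sql
-- Reshape raw, loaded member records into an analytical dimension.
create or replace table dim_members as
select
    row_number() over (order by member_id) as member_key,  -- surrogate key
    member_id,
    trim(first_name) || ' ' || trim(last_name) as full_name,
    cast(join_date as date) as join_date,
    main_branch
from raw.core_system_members
where member_id is not null;  -- basic cleansing happens in the warehouse
```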
With so many moving parts, orchestration is a key component of modern data analytics.
Orchestration includes scheduling of technical jobs, dependency management, and routing status and issues.
Vendors we have had positive experiences with that specialize in data pipeline orchestration include Prefect and Dagster. Airflow is a mainstay, but we found it to be monolithic and outdated for our needs. Products like Matillion provide a more integrated solution, combining capabilities of orchestration and transformation into a single user experience.
Credit unions must comply with regulatory requirements and data privacy laws. For many credit unions, this is a key risk that slows down innovation. PII and personal financial information are tied to every service a credit union offers.
Governance requires policies and standards about who can see data and how they can use it. It also requires tooling to implement and enforce these policies.
An unfortunate reality of data governance is that it can balloon quickly and become complicated. The innovation it is intended to unlock gets bogged down by an unwieldy governance process.
ALTR is a data access and control vendor that specializes in the financial services industry. Immuta is also a popular vendor.
One of the biggest killers of any data analytics effort is poor data quality. This often stems from incomplete or inaccurate data in the upstream sources, but it can also result from timing issues, calculation errors, or changing business needs that render existing metrics insufficient.
No mainstream standard exists to measure data quality. At Datateer, we like to split it into two separate concepts: fitness for purpose from a business perspective, and technical health.
However you organize, you need tools for automatically measuring quality, troubleshooting, and managing issues and resolutions.
We use a variety of tools at Datateer around data quality. Vendors that provide observability and testing include Metaplane, Monte Carlo, Datafold, Soda SQL, and Qualytics.
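For a flavor of what such tools automate, here are simple technical-health checks expressed in plain SQL; the table and column names are assumptions:

```sql
-- Freshness: has new data arrived in the last day?
select max(loaded_at) as last_load,
       max(loaded_at) < current_timestamp - interval '1 day' as is_stale
from raw.core_system_members;

-- Completeness: required keys should never be null.
select count(*) as null_member_keys
from fct_transactions_daily
where member_key is null;

-- Consistency: every fact row should resolve to a known member.
select count(*) as orphan_rows
from fct_transactions_daily f
left join dim_members m on m.member_key = f.member_key
where m.member_key is null;
```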
Business intelligence is the tip of the iceberg: it is what everyone thinks of when they think of analytics, because it is the most visible piece.
Credit unions need curated, managed, controlled reports and dashboards. If you also provide exploratory tools, you will get much more engagement and build momentum.
Some products, like Sigma Computing, Astrato, and Hex, focus on curated dashboards as well as exploration use cases.
Vendors that focus only on dashboarding and reporting include Tableau, Power BI, and Qlik.
Activation focuses on scenarios outside of business intelligence. It's a big topic that can be summarized in a few patterns: pushing data into operational processes (often called reverse ETL), feeding member-facing touchpoints, and enabling self-service discovery.
"DataOps," or data operations, refers to keeping the system running and continuously improving it.
A key situation is when something breaks in the orchestration. With so many moving parts, including upstream dependencies on data sources, things are bound to go wrong.
Requests come in for analyses and help with all kinds of questions around data. Managing and fulfilling these requests is a service a central data team can provide to the broader organization.
Many systems targeted to IT are suitable for data operations. At Datateer, we use Freshworks. The key is integrating the orchestration tool with the operations tool.
Many credit unions have been hesitant to embrace cloud infrastructure. This is changing, driven by pressure from members and partners, maturing security from cloud providers, cost improvements, and credit unions' scalability needs.
The decision of which cloud infrastructure provider to use is typically a larger one than data architecture planning alone.
If a credit union is lagging in maturity around cloud adoption and security, a data infrastructure is an excellent opportunity to advance the organization’s capabilities as a whole.
Many data product vendors will support AWS, GCP, and Azure.
Creating this from a blank slate can be difficult. However, I am certain you are not starting from a blank slate! Legacy systems, difficulties in accessing data, and ambiguity of critical entity definitions are all too common in credit unions.
The way to simplify this is to build in layers. Starting with the components that will bring the most immediate value, you can then layer on more capabilities. Thinking in iterations can transition naturally into a roadmap. Below is an example of a series of iterations to do just that:
Avoid getting bogged down with all the underpinnings and start with a single audience and single analysis use case. This still requires much technical implementation, so make the business scenario as narrow as possible.
In this stage, stand up a warehouse, a lake, and a BI tool. Choose two systems to begin ingesting; the core system and a CRM are good places to start. But don't ingest everything: focus on the data from those systems that supports your narrow use case.
Going very far without ensuring governance and security measures are in place puts too much risk on a credit union’s data operations.
In this stage, automate the use case from level 1 with orchestration, and add one or two more use cases. Keep them narrow. Your analyses do not need all the available data at this point, so defer what you can on the ingestion side.
Apply a layer of data governance (recognize that this will mature over time).
As you continue to increase supported analyses and data sets, the reliability of the system and quality of the data become critical to continued momentum and success.
Start your data operations by establishing processes to handle requests and incidents.
Implement data quality tooling. Include measurements of technical health as well as accuracy and completeness from a business perspective.
Now that you support a variety of subject areas, consider how to activate data beyond BI and exploratory tools. Consider how to automatically push data into operational processes, member-facing touchpoints, and self-service discovery.
Congratulations! You are well on your way to a data architecture based on proven practices. This will set up your credit union for success in data analytics and provide a foundation for future growth.
Was this article informative and helpful? Register to get the other articles in this series.
Credit unions are facing increasing pressure to meet the rising expectations of their members. More people are joining credit unions as members, and they are bringing their deposits with them. To remain competitive, credit unions must invest in technology, and data analytics is no exception.
Organizations that have implemented modern data analytics are reaping the benefits. They are able to provide a more personalized member experience, identify potential risks and opportunities, and make more informed business decisions. We will explore some of these case studies later in this article.
Data analytics allows credit unions to better understand their members' needs and preferences. With this information, you can tailor services to meet the specific needs of members.
Yet many credit unions struggle to modernize their data analytics capabilities. This can be due to a lack of resources or expertise, as well as outdated technology and infrastructure.
To remain relevant and competitive in today's market, credit unions must prioritize modernization. This means investing in the necessary technology and expertise to gather, analyze, and act on data insights. By doing so, credit unions can better serve their members and stay ahead of the competition.
In this article, we will look at why you must modernize your data analytics, examples of success from other credit unions, and potential obstacles.
Membership and member deposits are on the rise. Not only is membership increasing, but the rate of increase itself is accelerating.
According to data from CUNA, membership growth was accelerating even before the COVID-19 pandemic. During the pandemic, membership increased by 4.7 million in 2020, the largest annual increase on record.
In 2023, the collapse of SVB and other institutions led to general fear of small banks. Record withdrawals from small banks followed, with deposits shifting to large banks and credit unions.
Credit unions are investing in data and analytics. No matter what business cases you pursue, they will all require a solid foundation. Many credit unions are dealing with legacy core systems, siloed data, and older technology and processes. Modernizing these is the only way to establish a solid foundation for data analytics.
Members are customers of many other businesses. Generally, people’s expectations of customer service and customer experience are rising rapidly.
One strong use of data is a simple Member 360 profile. Gathering data about members and all their interactions creates a single data set that anyone can reference.
Customers of any business expect a consistent omnichannel experience. They expect you to have the right information however they interact, whenever they interact, and with whomever they interact inside your organization.
What becomes possible, or at least much easier, with a solid data analytics foundation in place?
First Tech Federal Credit Union leveraged data analytics to improve its loan portfolio management by implementing a machine learning model. This model analyzed borrower characteristics, credit score, and payment history to determine the probability of default for each borrower. Based on this analysis, First Tech was able to identify potential risks in its portfolio and adjust lending policies accordingly.
For example, First Tech noticed that borrowers with lower credit scores were more likely to default on their loans. To address this issue, the credit union increased its minimum credit score requirement for certain types of loans, resulting in a decrease in delinquency rates.
By using data analytics to monitor and adjust its loan portfolio management, First Tech saw a 10% decrease in delinquency rates, enabling the credit union to offer more loans and improve customer satisfaction.
Data analytics has become increasingly important for credit unions to improve member service. One great example is how DCU (Digital Federal Credit Union) leverages data to provide personalized service to its members. They analyze member data to gain insights into their financial habits and preferences. This allows them to provide tailored recommendations and solutions to help members achieve their financial goals.
Additionally, DCU uses data to monitor member feedback and complaints. They can quickly identify areas where they need to improve and take action to resolve member issues. As a result of their data-driven approach to member service, DCU has consistently been rated highly in customer satisfaction surveys.
Data analytics can also directly impact a credit union’s ability to identify fraudulent activity, see a complete Member 360 profile, understand the member’s “customer journey” with a credit union, improve investment portfolio management, and improve risk management.
I will spend some more time on this in an upcoming post.
The general outline is: lack of resources, security and compliance, time to value, and data quality.
Learn how to overcome these obstacles and more with our Guide to Managed Analytics.
Lack of resources can come in many flavors. It often manifests as a shortage of specialized expertise in the organization or as budget constraints.
Credit unions often look to improved tooling like Matillion and specialized services such as Datateer’s Managed Analytics.
Financial institutions are among the most highly regulated organizations in the world. Security requirements are high, and compliance with data privacy regulations and usage restrictions is non-negotiable.
For these reasons, executives and boards have historically been hesitant to even consider cloud-based solutions. However, most modern innovation is in cloud-based solutions.
Recent years have seen major improvements that unlock approval for credit unions to adopt modern, cloud-based data analytics solutions. Examples are security frameworks from organizations such as the Center for Internet Security, and specialized solutions like ALTR. These provide established, proven practices for solid data governance in the cloud.
Traditionally, showing value from large, technical efforts like data analytics takes a long time. But taking months, or even years, before making a business impact is not acceptable in today's environment.
Fortunately, newer frameworks and tools are emerging, such as the Simpler Analytics framework, Sigma Computing for BI, and dbt for metric calculations. These tools turn the traditional approach on its head: instead of long, monolithic technical efforts, they enable shorter iterations and faster time to value.
In a future article, I’ll demonstrate how to apply Simpler Analytics to credit union data analytics.
Unfortunately, this obstacle won't go away. Data quality is not merely a technical concept. Beyond technical concerns like the accessibility and reliability of data, data quality is a business concept that treats data as an asset.
It is a "never finished" type of issue: data is never perfect.
Also unfortunately, data quality remains ambiguous; no universal standard exists. Establishing a working definition of data quality within your organization, along with how to measure and report on it, will reduce these issues.