Accelerating Modernization

Kelly Carrigan
July 26, 2022

Today’s executives face multifaceted challenges when guiding their organizations. Almost all would agree that speed-to-market, Cloud adoption, available skilled labor, and aging infrastructure top the list. The dynamics of any organization greatly influence how these challenges are tackled, from the leadership team’s collective vision to the staff’s collective buy-in of that vision. Sandwiched somewhere in between is the magnitude of the legacy footprint.

The fact remains that leadership is tasked with leading, and that means making hard decisions that drive the future. That future must embrace the need to modernize, and the time is now. So, what is the best way to approach this? Is it wholesale change, boiling the ocean, or transitional change, putting a stake in the ground? And, more importantly, what is this approach being applied to? Is it a new feature request, a business acquisition, a Greenfield project, or simply research and development? Whatever the application, the approach and the technologies chosen are the keys to success.

The Data Management ecosystem remains one of the most significant areas for an organization to embrace modernization. This is easy to see, as most organizations have some sort of data aspirations, along with a legacy attempt at delivering on them. Even the most buttoned-up environments have many moving parts that have been stitched together over time. It can be an overwhelming and daunting task.

Once there is acceptance of the need to modernize and the technologies are chosen, figuring out how to make the transition becomes the harder part.

One possible approach that opens the door to shifting processing, while providing immediate resources for new business capabilities, is dual-feeding both the legacy and the future platform. On the surface, consumption-based technologies ease the financial burden of getting started and make this worth considering. Of course, knowing where you are headed is critical, since dual environments can only be sustained for so long.
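
Below is a minimal, hypothetical sketch of what a dual-feed step can look like: the same incoming batch is written to the legacy warehouse and to Snowflake, so both stay in sync while processing gradually shifts to the new platform. The legacy ODBC DSN, the Snowflake account and credentials, and the ORDERS_STG table are all placeholder assumptions for illustration.

    # Dual-feed sketch: write one batch to both the legacy and the future platform.
    # Assumes pyodbc and snowflake-connector-python are installed; all names are placeholders.
    import pyodbc
    import snowflake.connector

    batch = [
        (1001, "EMEA", 250.00),   # illustrative (order_id, region, amount) rows
        (1002, "AMER", 120.50),
    ]

    # 1) Feed the legacy platform (assumed ODBC DSN for the existing warehouse).
    legacy = pyodbc.connect("DSN=LEGACY_DW")
    legacy_cur = legacy.cursor()
    legacy_cur.executemany(
        "INSERT INTO ORDERS_STG (ORDER_ID, REGION, AMOUNT) VALUES (?, ?, ?)", batch
    )
    legacy.commit()
    legacy.close()

    # 2) Feed the future platform (Snowflake) with the same rows.
    sf = snowflake.connector.connect(
        account="xy12345",        # hypothetical account locator
        user="LOAD_USER",         # hypothetical service user
        password="********",
        role="SYSADMIN",
        warehouse="LOAD_WH",
        database="SALES_DB",
        schema="STAGING",
    )
    sf_cur = sf.cursor()
    sf_cur.executemany(
        "INSERT INTO ORDERS_STG (ORDER_ID, REGION, AMOUNT) VALUES (%s, %s, %s)", batch
    )
    sf.close()

In practice this logic usually lives in the ingestion tool or pipeline framework rather than a standalone script, but the pattern is the same: one source of truth for the batch, two targets.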

Often, the single biggest obstacle to accelerating modernization is simply getting started. Opinions differ on what the path looks like, and each path carries inherent risks. Proven reference architectures and skilled resources are major contributors to mitigating that risk.

Technologies such as Snowflake have revolutionized this segment and been a major catalyst for Cloud migrations. Snowflake’s platform has proven itself capable of addressing workloads for:

  • Data Warehousing – data volumes, compute processing, and compute isolation
  • Data Lakes – containing structured and semi-structured content such as JSON and XML, thereby allowing for schema-on-read at scale (see the sketch following this list)
  • Data Engineering – pipelines allowing for macro, micro, and streaming loads
  • Data Sharing – entirely new and simplified ways of exchanging data between internal and external business units
  • Data Applications – compatibility that fully addresses reporting and analytics
  • Data Science – extensions for full Machine Learning and language libraries
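
As a concrete illustration of the Data Lake item above, the following minimal sketch (using the official snowflake-connector-python package) lands a raw JSON document in a VARIANT column and then queries nested fields with schema-on-read; the account, credentials, warehouse, and table name are hypothetical.

    # Schema-on-read sketch: raw JSON lands in a VARIANT column and is shaped at query time.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345",    # hypothetical account locator
        user="POC_USER",      # hypothetical user
        password="********",
        role="SYSADMIN",
        warehouse="POC_WH",   # hypothetical virtual warehouse
        database="POC_DB",
        schema="PUBLIC",
    )
    cur = conn.cursor()

    cur.execute("CREATE TABLE IF NOT EXISTS RAW_EVENTS (PAYLOAD VARIANT)")
    cur.execute("""
        INSERT INTO RAW_EVENTS
        SELECT PARSE_JSON('{"order_id": 1001,
                            "customer": {"id": 7, "region": "EMEA"},
                            "items": [{"sku": "A1", "qty": 2}, {"sku": "B9", "qty": 1}]}')
    """)

    # Structure is interpreted at read time: dot/bracket paths plus FLATTEN for arrays.
    cur.execute("""
        SELECT PAYLOAD:order_id::NUMBER          AS order_id,
               PAYLOAD:customer.region::STRING   AS region,
               item.value:sku::STRING            AS sku,
               item.value:qty::NUMBER            AS qty
        FROM RAW_EVENTS,
             LATERAL FLATTEN(INPUT => PAYLOAD:items) item
    """)
    for row in cur.fetchall():
        print(row)

    cur.close()
    conn.close()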

Because Snowflake is a pure-play SaaS, implementing and deploying it is easy. So easy, in fact, that it is tempting to bypass some very important upfront considerations.

Intentionally bypassing steps for the sake of time when conducting a POC is commonplace. However, POCs often include ‘quick & dirty’ compromises that were never intended to go to production, so understanding the architectural implications at the beginning is critical. This is especially true in the areas of:

  • RBAC – Role Based Access Control
  • Cloning
  • Schema organization and management

Let’s look briefly at why each of these three has implications that need to be understood.

RBAC – Role Based Access Control

Rather than users or schemas owning database objects, Snowflake uses an RBAC model and supports full inheritance through roles. This allows clearly defined Business Function roles to be implemented separately from Object Access roles. Technically speaking, there are no inherent differences in what either is allowed to do; it really just comes down to how the roles will be used and managed. This is usually the first thing to be bypassed in a POC, as it can slow down the process and requires upfront thought about how the legacy approaches may or may not have worked. As a result, you will often see internally and externally run POCs with all database objects owned by SYSADMIN. This is one of the first things that must be addressed when planning for production.
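
As a minimal sketch of that separation, the statements below create an Object Access role that only carries privileges and a Business Function role that inherits it, then roll the hierarchy up to SYSADMIN. The account, credentials, SALES_DB database, and JDOE user are assumptions, and the script uses the official snowflake-connector-python package.

    # RBAC sketch: separate Object Access roles from Business Function roles.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345",           # hypothetical account locator
        user="SECURITY_ADMIN_USER",  # hypothetical admin user
        password="********",
        role="SECURITYADMIN",        # role creation and grants typically run here
    )
    cur = conn.cursor()

    statements = [
        # Object Access role: carries privileges on objects, no business meaning.
        "CREATE ROLE IF NOT EXISTS SALES_DB_READ",
        "GRANT USAGE ON DATABASE SALES_DB TO ROLE SALES_DB_READ",
        "GRANT USAGE ON ALL SCHEMAS IN DATABASE SALES_DB TO ROLE SALES_DB_READ",
        "GRANT SELECT ON ALL TABLES IN DATABASE SALES_DB TO ROLE SALES_DB_READ",
        # Business Function role: maps to a job function and inherits access roles.
        "CREATE ROLE IF NOT EXISTS SALES_ANALYST",
        "GRANT ROLE SALES_DB_READ TO ROLE SALES_ANALYST",
        # Keep the hierarchy connected so SYSADMIN retains visibility.
        "GRANT ROLE SALES_ANALYST TO ROLE SYSADMIN",
        # Finally, assign the function role to a user.
        "GRANT ROLE SALES_ANALYST TO USER JDOE",
    ]
    for stmt in statements:
        cur.execute(stmt)

    cur.close()
    conn.close()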

Cloning

Full copies of data can be created in Snowflake through cloning, which can be specified for databases, schemas, or tables. Effectively, these are pointer-based snapshots that only consume space for the partitions that change from their original state. This becomes incredibly useful for Dev/Test provisioning and object migration and can radically change the way DBAs and System Administrators provide services. For example, having a full copy of a production database for use as Dev/Test may be cost-prohibitive with legacy technologies, whereas it is a non-issue with a solution like Snowflake. While conducting POCs, it is important to understand the types of processes that will likely be handled differently with modernized technologies.
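
A minimal sketch of that provisioning pattern, assuming an existing SALES_DB production database and hypothetical credentials:

    # Zero-copy cloning sketch: provision Dev/Test from production without copying storage.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345",   # hypothetical account locator
        user="DBA_USER",     # hypothetical user
        password="********",
        role="SYSADMIN",
    )
    cur = conn.cursor()

    # Clone an entire production database as a Dev/Test copy; storage is consumed
    # only for partitions that diverge from the source after the clone is created.
    cur.execute("CREATE DATABASE IF NOT EXISTS SALES_DB_DEV CLONE SALES_DB")

    # Clones also work at schema and table level, and can combine with Time Travel,
    # e.g. a table as it existed one hour ago.
    cur.execute("CREATE SCHEMA IF NOT EXISTS SALES_DB_DEV.QA CLONE SALES_DB.PUBLIC")
    cur.execute(
        "CREATE TABLE IF NOT EXISTS SALES_DB_DEV.QA.ORDERS_ASOF "
        "CLONE SALES_DB.PUBLIC.ORDERS AT (OFFSET => -3600)"
    )

    cur.close()
    conn.close()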

Schema Organization and Management

Like other databases, Snowflake uses schemas to house database objects and fully supports 3-part names for object references. Depending on the size and scope of the POC, named schemas may or may not be used; if they are not, objects will likely be placed in the Public schema. Additionally, to better manage grants on objects within a schema, Snowflake allows for the creation of Managed Access schemas, which essentially promote grant privileges up to the owning role of the schema rather than the individual owning role of each object. Again, this is likely to be bypassed during POCs but needs to be understood when planning for production.
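
The sketch below, again with hypothetical names and credentials, creates a named managed access schema instead of defaulting to Public and grants creation rights to an assumed DATA_ENGINEER role; with managed access, grant control sits with the schema’s owning role rather than with each object’s owner.

    # Managed access schema sketch: named schema with grants controlled at the schema level.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="xy12345",   # hypothetical account locator
        user="DBA_USER",     # hypothetical user
        password="********",
        role="SYSADMIN",
    )
    cur = conn.cursor()

    # Named, managed access schema instead of landing POC objects in PUBLIC.
    cur.execute("CREATE SCHEMA IF NOT EXISTS SALES_DB.CURATED WITH MANAGED ACCESS")

    # Engineers can create objects here (referenced by 3-part names) without
    # also controlling the grants on what they create.
    cur.execute("GRANT USAGE ON SCHEMA SALES_DB.CURATED TO ROLE DATA_ENGINEER")
    cur.execute("GRANT CREATE TABLE ON SCHEMA SALES_DB.CURATED TO ROLE DATA_ENGINEER")
    cur.execute(
        "CREATE TABLE IF NOT EXISTS SALES_DB.CURATED.CUSTOMER_DIM "
        "(CUSTOMER_ID NUMBER, CUSTOMER_NAME STRING)"
    )

    cur.close()
    conn.close()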

Converge Technology Solutions’ proven methodology works jointly with customers to explore and vet these modernization opportunities and all the implications of implementing them. Understanding the environment to ensure the best possible fit is where we begin.

Keys to Accelerating Modernization

  1. Embrace the need to modernize
  2. Share that vision to gain buy-in
  3. Ensure objective views on possible technologies
  4. Leverage experience of proven methodologies, architectures, and technical resources

While we have discussed everything from management vision and transitional approaches down to technology capabilities, we have only scratched the surface when it comes to modernizing.

Partners such as Converge Technology Solutions can offer help navigating through this. Contact [email protected] today!
