Stars, Flakes, Vaults and the Sins of Denormalisation
Stephen Brobst

Performance and flexibility are often seen as contradictory goals in designing large-scale data implementations. In this talk we will discuss techniques for denormalisation and provide a framework for understanding the performance and flexibility implications of various design options. We will examine a variety of logical and physical design approaches and evaluate the trade-offs among them. Specific recommendations are made to guide the translation from a normalised logical data model to an engineered-for-performance physical data model. The roles of dimensional modelling and of various physical design approaches are discussed in detail, as are best practices in the use of surrogate keys. The focus is on understanding the benefit (or lack thereof) of the denormalisation approaches commonly taken in analytic database designs.
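
To make the kind of translation described above concrete, the sketch below shows one common path from a normalised logical model to a star-schema physical design: a customer/geography pair in third normal form is collapsed into a single customer dimension keyed by a system-generated surrogate, which the fact table then references. All table and column names are hypothetical illustrations, not material from the course.

-- Normalised (3NF) logical model: geography held in its own table.
CREATE TABLE city (
    city_id   INTEGER PRIMARY KEY,
    city_name VARCHAR(100),
    region    VARCHAR(50)
);

CREATE TABLE customer (
    customer_id   INTEGER PRIMARY KEY,   -- natural (business) key
    customer_name VARCHAR(100),
    city_id       INTEGER REFERENCES city (city_id)
);

-- Denormalised star-schema equivalent: geography attributes are folded into
-- a single customer dimension, keyed by a system-generated surrogate so that
-- history can be versioned independently of the business key.
CREATE TABLE customer_dim (
    customer_key   INTEGER PRIMARY KEY,  -- surrogate key
    customer_id    INTEGER,              -- business key kept as an attribute
    customer_name  VARCHAR(100),
    city_name      VARCHAR(100),
    region         VARCHAR(50),
    effective_date DATE,                 -- row-version validity window
    expiry_date    DATE
);

CREATE TABLE sales_fact (
    customer_key INTEGER REFERENCES customer_dim (customer_key),
    sale_date    DATE,
    sale_amount  DECIMAL(12,2)
);

The pay-off is fewer joins at query time; the costs are redundant geography data and the maintenance of the surrogate-key pipeline, which is exactly the kind of trade-off the talk evaluates.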