Great article! The challenge with both options is the need to manually document the semantic layer. AI can help... a little... but you need a dedicated solution for it. I actually just shared some content about it today: https://journey.getsolid.ai/p/semantic-layer-for-ai-lets-not-make?r=5b9smj&utm_campaign=post&utm_medium=web&showWelcomeOnShare=false
What I would say is that documentation is never in production, but the semantic layer always is. Therefore it is kept up to date, or people's reports don't do what they want, whereas documentation goes stale without anyone realising.
Agree 100%
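To make that point concrete, here's a toy Python sketch. The metric names and the compile step are made up (not any vendor's API); the point is that every report query is compiled from the shared definition, so a stale or renamed definition breaks reports immediately instead of rotting silently like a docs page.

```python
# Toy illustration only: the metric definitions and compile step are
# invented, not any particular semantic layer's API.

METRICS = {
    # The single production definition every report compiles against.
    "revenue": {"sql": "SUM(amount)", "table": "orders"},
}

def compile_metric_query(metric: str, group_by: str) -> str:
    """Compile a report's metric request into SQL from the shared definition."""
    try:
        m = METRICS[metric]
    except KeyError:
        # A stale or renamed definition fails loudly at query time --
        # unlike a stale docs page, which nobody would notice.
        raise KeyError(f"Unknown metric {metric!r}: the report breaks immediately")
    return (
        f"SELECT {group_by}, {m['sql']} AS {metric} "
        f"FROM {m['table']} GROUP BY {group_by}"
    )

print(compile_metric_query("revenue", "region"))
# A report asking for a metric that drifted out of the layer errors out:
# compile_metric_query("net_revenue", "region")  -> KeyError
```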
Thanks for linking the article from Rohan, this is stellar!
Sadly it looks like not much has happened since 2022 when I wrote this Metric Layer overview, except of course the acquisition of Transform/MetricFlow.
Linking the original article in case anyone is interested:
https://medium.com/@vfisa/an-overview-of-metric-layer-offerings-a9ddcffb446e
It might be interesting to get Rohan's opinion on other metric layer technologies mentioned.
Yes, in some ways things have gone backwards, but the remaining players like Cube and dbt have improved their offerings.
Hi Austin! I continue to believe that investing in a separate dedicated semantic layer doesn't make sense for us at this time, both because our existing Lightdash semantic layer has expanded its feature set (e.g. they've since added a Python package for accessing metrics/semantic entities externally), and because I don't feel confident in the other two solutions.
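For anyone curious what "accessing metrics externally" looks like in practice, here's a rough sketch of the workflow. The client class, method names, and URL below are hypothetical placeholders, not the actual Lightdash package's API, so check their docs for the real interface:

```python
# Hypothetical sketch only -- SemanticClient and its methods are placeholders,
# NOT the real Lightdash package's API. The point is the shape of the workflow:
# reuse the BI tool's governed metric definitions from scripts and notebooks
# instead of re-deriving the SQL by hand.
import os

class SemanticClient:
    """Stand-in for a vendor SDK that exposes governed metrics over HTTP."""

    def __init__(self, base_url: str, api_key: str):
        self.base_url = base_url
        self.api_key = api_key

    def query(self, metrics: list[str], dimensions: list[str]) -> list[dict]:
        # A real SDK would POST this request to the semantic layer's API
        # and return result rows; here we just echo the request shape.
        return [{"requested_metrics": metrics, "requested_dimensions": dimensions}]

client = SemanticClient(
    base_url="https://app.lightdash.cloud",  # placeholder URL
    api_key=os.environ.get("LIGHTDASH_API_KEY", "demo"),
)
rows = client.query(metrics=["orders_total_revenue"], dimensions=["orders_status"])
print(rows)
```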
I know this is a slightly older article, but I'd be super interested in an updated opinion on these semantic layer capabilities. We're a greenfield organization building out a BI and analytics ecosystem, and I'm evaluating AtScale, the dbt Semantic Layer, and Cube.js. I love the collaboration possible with dbt Core for ELT, but I'm not as stoked about their semantic layer capability.
I think between those three, what we said mostly holds true. Cube has some quite advanced agent capabilities now on top of its cloud offering.
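If it helps your evaluation: the core consumption path for Cube is its REST API's /load endpoint, which takes a JSON query against your cube definitions. A minimal sketch, where the deployment URL, token, and the "orders" cube with its measures and dimensions are placeholders for your own setup:

```python
# Minimal Cube REST API sketch. The deployment URL, token, and the "orders"
# cube's measures/dimensions are placeholders -- swap in your own deployment.
# Endpoint shape per Cube's REST API: POST /cubejs-api/v1/load.
import requests

CUBE_URL = "https://your-deployment.cubecloud.dev/cubejs-api/v1/load"  # placeholder
CUBE_TOKEN = "YOUR_API_TOKEN"  # placeholder JWT

query = {
    "measures": ["orders.total_revenue"],   # example measure name
    "dimensions": ["orders.status"],        # example dimension name
    "timeDimensions": [
        {"dimension": "orders.created_at", "granularity": "month"}
    ],
}

resp = requests.post(
    CUBE_URL,
    headers={"Authorization": CUBE_TOKEN},
    json={"query": query},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["data"]:
    print(row)
```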
I would really look at what Databricks and Snowflake have now too; they're progressing fast. In many ways the warehouse is actually the ideal place to have a semantic layer, given how closely it integrates with the DWH's SQL. I'm about to deploy on Databricks soon.
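Since I mentioned Databricks: part of the appeal is that the semantic objects live next to the warehouse, so consumption is just a SQL connection. A rough sketch using the databricks-sql-connector package; the hostname, HTTP path, and metric view name are placeholders, and the MEASURE() syntax is my understanding of how their metric views are queried, so verify against the current docs:

```python
# Rough sketch, assuming a Unity Catalog metric view already exists.
# server_hostname, http_path, and main.sales.orders_metrics are placeholders;
# the MEASURE() aggregation reflects my understanding of Databricks metric
# views -- check the current docs before relying on it.
import os
from databricks import sql  # pip install databricks-sql-connector

with sql.connect(
    server_hostname="dbc-xxxx.cloud.databricks.com",  # placeholder
    http_path="/sql/1.0/warehouses/xxxx",             # placeholder
    access_token=os.environ["DATABRICKS_TOKEN"],
) as conn:
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT order_status, MEASURE(total_revenue) AS total_revenue
            FROM main.sales.orders_metrics
            GROUP BY order_status
            """
        )
        for row in cur.fetchall():
            print(row)
```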