Elena has recently started a new role as Sr Manager, Data at Bubble.io and is getting ready to launch a Product Analytics course on CoRise. Previously, she led Product Analytics and Data Science at Peloton and Spring Health. She is also speaking at MDSFest, so watch out for the product analytics panel there!
I'm pleased to start this series with someone who has been a practitioner for their whole time in Data. This will act as an anchor for the rest of the series, where we will have other folks who have also worked in consultancy and at vendors.
I realised that I hadn't had the chance to speak to Elena synchronously before, so we did this interview on Zoom and I used the transcript as a base for this post - it may sound more conversational than some future posts.
Education
I studied Economics. It lends itself well to a data science career, especially as my concentration was in applied math and microeconomics. I had many semesters of econometrics and linear algebra, lots of theory. I wish I understood the actual applications of it back then and knew what questions to ask!
I got my degree back home in Russia at Moscow State University. And then I went to grad school again - this time in the US. There I studied sports business, as my first career was in sports.
What did you do in sports?
I did everything: sponsorship sales, ticket sales, media operations, social media, a little bit of product management, event organization, sports writing. Almost everything under the sun.
Wow, that's a whole career.
Yeah, I did it for six years. And the goal was to continue in the same space in the US, but it didn't work out for visa reasons. The organizations I was most interested in were NGOs and not in a position to offer visa support. So, I had to go into tech for a better shot at a work visa.
So how did you move from there into Data? Was that the transition from there into Data, or was it somewhere else?
There was a startup called Repucom, now acquired by Nielsen, that did sponsorship ROI measurement for sports properties. They had their own measurement formula. They captured the broadcast of, let's say, a soccer game, and then a combination of technology and human tagging would look at every video frame: where is the sponsor logo? What part of the screen is it on? What size is it? What type of logo is it - LED or static? Then, based on all these parameters, it would assign a specific coefficient, and you overlay that with the TV ratings data for the sponsor's target demographic. Long story short, it spits out a dollar number, and you can say - ok, this number is more than you paid for the sponsorship package.
That was my transition to tech, although not super high tech. Lots of spreadsheets - Excel for the most part - but it was a good segue into being in the American workforce and getting the lay of the land, and then, after that, transitioning to greener pastures and challenging myself more.
Can you share an overview of where you've worked in Data and what kind of data roles you've held?
I think the most relevant would be my last couple. I was at Peloton for about four years. I built out their entire Product Analytics org from the ground up. I initially joined as the first person on the product team with data chops. And after four years, I built my team out to 14 people, supporting the entire product portfolio. And I brought one of my previous coworkers to be my peer on the e-commerce side. So, the whole product analytics org was my baby. Then I worked at Spring Health for about a year as the lead of a product analytics-focused branch of their central Data Science team.
What made you move on from Spring Health?
I had a great team - a very diverse, talented group of folks - a great manager, and interesting problems to work on, but the scope of my role evolved into something more operations- and process-focused, and less involved in strategy than I would have liked. I would also say that after Peloton and Spring Health I had reached an uncomfortable degree of accumulated burnout.
That's a real problem. So you are still working as a practitioner at Bubble? What does Bubble do?
Yes, my team is small, so I am at the moment very hands on - although it is all very new! Bubble is a no-code software development platform that lots of founders use to build out the first iteration of their product without the need to hire an engineering team. And the platform is sophisticated enough to also let them grow and scale when their product gets to product-market fit!
And does it use AI to do this? There are so many companies taking this approach right now.
The core product has a very robust editor that lets you build out not just the front end, but also your production database, your back end, and API connectors, all in a point-and-click fashion. As far as AI, the platform plays nicely with the OpenAI API, and of course, as with most companies now, there is constant evaluation of other productive ways to integrate it into the product experience.
Human Interfaces which worked well
In product analytics, which has been my focus for the most part, this is generally extremely important, as you have to serve a lot of stakeholders - or, I guess, form lots of thought partnerships - but you also need to influence a different subset of partners. For example: software engineering. At Peloton - even within the same company - I had good interfaces with some engineering managers and quite unproductive ones with others, which is pretty interesting.
How to achieve a good interface? It is very important to be intentional about how you welcome cross-functional partners into your ecosystem and how quickly you show them the mutual benefit.
Otherwise, it could be a one-sided interface. In this example, a software engineer could always feel like I need something from them: I need them to build my telemetry, I need them to do something differently with the production database. And if they don't understand what the benefit is to them as a person, or to their team, it's really hard to achieve the perception that data work is part of the product rather than something optional.
On the other hand, some engineering managers would always know that they have to ship new features with logging instrumented. It was scoped for in advance, never last minute, never a fire. Coincidentally, within these same engineering pods all engineers knew how to use the self-service analytics tools, how to take advantage of the data that they'd implement, and would use it on their own; maybe to debug something, to investigate a customer ticket, or to look at the adoption of the feature they built.
It is a very interesting kind of human problem to think about. How to improve the relationship and gently shift the mindset from: "all this data stuff is just distracting me from building the product" to "this data stuff is actually essential for building the product and is a part of the product".
So what was different about those people who got it straight away?
Also, where you've managed to convince someone who didn't think about telemetry as part of the product - I'd like to drill into how you were successful then. What was special about those people, as well?
I think part of it was probably just their background and previous experiences, which is obviously not something that I can impact. But it's important for me to know, so that I can choose my approach to that person.
But I think those software engineers, were they like data scientists or data people before, is that what you found?
Not necessarily, but they may have worked at companies that had a good data culture… I think maybe it even goes beyond just the partnerships with data people. At their previous company, were they treated as a feature factory, where they get very defined requirements and execute on them, or more like thought partners, where engineers are truly empowered to contribute to the solution? So, potentially that's also the difference. It's not just the data relationship, but more so how they perceive their role. At Peloton vs Spring Health, for example, this was very different. Peloton was more geared towards empowerment across product management and engineering; at Spring Health, from my perspective, there was less empowerment, more “waterfall”, and more building of what was defined in stages by different sets of contributors.
So, to overcome an unproductive relationship, what I was trying to do was to show them the magic. Let's take telemetry as an example - they have to see the magic of it and have their AHA moment.
Same as product management, right? Your user has an AHA moment when they start using your product. And for engineers it's the same thing. So, figure out what that AHA moment would be for them. For some folks it could be: oh, I get a support ticket where a member is complaining about a specific thing. I can open up Amplitude, bring up that user, and see what's going on. Now they're like, oh, now I get it. Now I know why you want me to name things in a certain way, because I don't even have to look in a data dictionary or in the schema - I can actually understand what's going on. Of course, product analytics is not just for debugging things. But it is a fringe benefit - so easy!
When folks are joining the company, having them onboard onto our product analytics tool early on - in the first two weeks - that's something that I've definitely seen have a high correlation with becoming good partners. For Segment and Amplitude, our best engineering manager partners would say: "oh, but everybody on my team has access, it is in my onboarding checklist, I make sure that everybody logs in”.
You have to make folks aware that it's part of the stack. Here's the training, so you can see what it does.
I think it all comes down to this: it has to be beneficial for them. If it's just something that they're doing for kind of the grander good of the company, it just doesn't resonate quite as much. There has to be something pragmatic about it.
Human Interfaces which didn't work well
I already gave an example of engineering pods where I couldn’t quite get traction as I missed my chance to bring folks into the “magic” early on.
The other example would be cross-functional stakeholders that didn't have hands-on dedicated support. I think these are usually much harder to navigate than building interfaces with product squads with an embedded data person. In the latter case, the interface is less transactional. They have that thought partnership. They talk every week.
If there is, let's say, a different team that we support on a more ad hoc basis - I had that both at Spring Health and at Peloton - that can devolve very quickly into something a bit unproductive, where, as a manager, I have to protect the bandwidth on my team and constantly balance it with what the company needs.
But I also have to be strategic about it, where, let's say, maybe I don't particularly want to do a given request compared to other things on our plate, but it would be good to just do a solid for this partner, so that in the future they will help us with information or an impactful opportunity.
Trying to balance all of these things for each specific instance of being asked to do something, there's a decision tree in my head of, okay, does it actually make sense what they're asking, what's the impact of it? Is it good from a, for lack of a better word, diplomatic standpoint, to do a solid for this stakeholder? Is it good to bank a little bit of the positive points for our long term relationship? And is there bandwidth on my team? Do we even know enough to answer this question?
Lots of the factors above help formulate the decision of how to support a stakeholder. And it's been sometimes a hit, sometimes a miss. It tends to work better where stakeholders are willing to do some work - especially those with significantly more subject matter expertise - where you get them started on something, teach them to fish a little bit, and get them into their own exploration. It doesn't work as well where there is no bandwidth, or where there is more of an expectation of lots of handholding. That is usually hard to achieve, especially on a bigger team and in a faster-paced environment.
I won’t lie, I got some feedback at Spring Health of like, hey, you need to be more collaborative. I’d fallen into a pattern of saying ‘no’ too easily, as we were getting pretty overwhelmed with requests.
That's not usually something data people are accused of: saying no too easily. I think that's probably a good sign.
Yeah, maybe. My approach is a work in progress. When I came on board at Spring, looking at my manager and some of my peers who led different parts of the data function there, I was like: “Oh, that's how you say no firmly”. I had a huge problem with saying no! So I said: “I'm going to learn from you all”. And I guess I learned too well. At the next performance review, both I and some other folks got feedback like, hey, we really want you to be more collaborative, less staunch about what you do and what you don't do. But given the pace of a hypergrowth company, some reorgs, and changing priorities, it was also impossible to keep up if you weren't really good about setting your boundaries and saying no. Every company definitely comes with its own cultural quirks.
Human Interface which should have existed but didn't
That's a good question. What was interesting at Peloton is that the data team was fully decentralized. So actually, the interface that was missing was between all these decentralized data teams. When the company got really big, it got progressively worse, because there were more and more new data teams forming. We had marketing analytics, content analytics, supply chain; apparel had their own analyst, finance had their own… so it was really a lot. And we got to a place where we could not effectively solve big problems that went beyond a single team’s scope. So, let's say we launched the new treadmill model and the sales weren’t initially in line with expectations.
This is an example of where, okay, it's not just a product analytics thing, right? Part of it, yes, we have to look at once the users buy it, how they engage with it. But there's also a big marketing component to it - how is it positioned, how do we get someone to buy. And also - was the sales forecast reasonable? Did we estimate our SAM and TAM correctly? That’s on the strategy and market research side. So how do five analytics and research teams come together and tackle the question of product market fit for Tread?
It was challenging, because we really did not have a process; we did not have a human interface. We had our own separate roadmaps. And it was even hard to get people in the same room and get them to share openly with each other what they'd learned so far. Maybe there is some fear of finding which functional team would be “to blame”? Another thing is that, being a public company that was also at the top of the news cycle, there was lots of confidentiality, and certain things maybe wouldn't be shared until the last minute, or people would be hesitant to share until they were fully polished. So, that was an example of a human interface that didn’t exist within the data world.
A different example that comes to mind would be the missing interface between data and senior leadership where the data folks really get to contribute to the strategy versus getting the outcomes of strategic discussions as a to-do list after these discussions have already happened without data folks in the room. I think this is probably a common problem.
There's a lot of discourse - I think Benn wrote about it, too, about the “Missing Data Executive”, of having somebody who is really at that level and understands what's going on and can participate in the discussion. So, I think that's quite a common problem. At Peloton I maybe didn't experience it as much because I had a really good relationship with the Chief Product Officer, I met with him one to one a lot, so I got more visibility into his thought process. Spring Health had more of a structured approach, where there was a notion of an extended leadership team and if you're not a certain level of seniority, you're not there. It was very much more formal.
What are you doing in your current role to make these interfaces better?
I just started this week, and one thing that I'm trying to do very differently from how I started out at Spring Health, and which potentially did not set me up for success, is to spend more time learning about all these cross-functional folks that my team is going to support, before jumping into the action.
I think it can be intimidating when you’re coming into a bigger company with a small data team. Coming in as a more experienced person, there are lots of expectations: "she's coming in and she's going to solve all our problems". This creates pressure. At Spring Health, I succumbed to that pressure - I even had to write a roadmap and OKRs in my first week there and jump into things very quickly. I think I skipped the step of really understanding the landscape, and maybe I should have said no a little bit more in the beginning. I just fell into responding to the expectations, and this is something I'm trying to do very differently at Bubble. Their culture is more tolerant of this: really spending time getting to know people better, getting to know all the function leads, getting to know the founders a little bit better.
I want to spend time digesting and structuring this information in my head before making a plan for reasonably supporting all functions with limited resources, and workshopping with my manager the best way to socialize it.
I'm trying to be much more intentional about building out these interfaces and really getting to know people. I guess it goes back to product management, right? Really getting to know your users. I'm doing “user research” right now, figuring out their main problems and pain points. Then, what are some ways I can solve them in different forms? What would be the MVP? What would be the more full-featured support?
Key Takeaways
The turnover rate for Data Leaders is incredibly high (an 18-month average tenure, last time I saw a stat). Whilst I’m sure that lack of support, structure, and tooling plays into this number, burnout inevitably also plays a part.
I think the importance of incentives, when working with non-technical stakeholders, has become well understood amongst data folks. However, the importance of incentives when working with engineering teams can be forgotten - they also have incentives to work with you, as Elena has described well!
Elena really gets into the nitty-gritty of working with other people, and how it often involves compromise and negotiation. Sometimes I feel data folks aren’t willing to engage at this level - they can have a very idealistic perspective. The truth is that this is how you sometimes need to get things done in business. There’s nothing wrong with a bit of horse-trading!
Knowing who you are dealing with, where they came from and how this shapes their current outlook is something so important, and Elena explains this really well above.
Elena also goes into what I would describe as “sales”: you need to “sell” your data product to your stakeholders. Yes, they might have asked for it in your job description and in their requests, but getting them to their “Aha!” moment is crucial to delivering value. Don’t ignore this last leg of the data value relay race - we devalue all of our prior Data and Analytics Engineering work by neglecting it. She even describes proactive sales: making sure that new starters understand the value in the data and data tooling available.
The missing interface between distributed data teams, or between parts of a data team, is a common problem I hear about. I still stand by hub and spoke as a solution for this - it requires skilled management, being able to adapt to new situations well, and being able to justify budget for sufficient hub/central resource… who would have thought the best way was also the hardest work and the easiest to mess up? I don’t think this is a reason to avoid it and go for the extremes of full centralisation or full distribution either; we shouldn’t shy away from the hard parts of management if we end up there. I wonder if hub and spoke data team management could be its own CoRise course 🤷‍♂️.
The missing interface between data leaders and company leadership is well documented now, but remains a big problem. I’ve had the kind of relationship with a CXO leader that Elena described at Peloton, and then seen how it all goes wrong when they leave - it’s brittle in this way.
I love what Elena describes doing at Bubble: taking the time to really understand her situation, her stakeholders, and their needs before deciding how to take action. User research is definitely something data folks should do more of.