Interview with the Data Transformation Office (DTO) Team
In February 2022, the P2P team sat down with members of the Data Transformation Office (DTO) team currently supporting the Navy’s data enablement effort on Jupiter. Below is an exclusive interview with CAPT Michael O’Leary, the Military Deputy and Executive Transformation Management Lead at DTO, and Andrew Stiltner & Manny Moreno from the Deloitte team, who support Naval Implementation Framework (NIF) objective owners in moving their efforts’ data into Jupiter, leveraging the P2P methodology to help them develop wireframes, Qlik dashboards, and more.


P2P Team: Can you provide some background on why the Navy is using Jupiter and the DTO’s current NIF data enablement efforts? 

DTO Team: The DON’s Enterprise Data and Analytics Environment exists as an enclave within the DoD Advana platform. Advana was adopted primarily to make the DoD financially auditable, an effort that has been going on for a long time, and the Navy saw the need for adoption as well. Using Jupiter is a logical choice: while a lot of Naval entities have stood up different cloud environments to do their analytics, this is the first time we are attempting to do enterprise-wide analytics in one environment. Our current efforts center on providing descriptive analytics as we start leveraging Trifacta and Databricks to help develop the Navy’s modeling capabilities. Eventually, we want our analytics effort to work toward something more prescriptive in order to help the Navy make better decisions. There is still more for us to do in setting up how data shows up and moves around within Jupiter, determining what data needs to be there, and cleaning it up so it is consumable by the Artificial Intelligence and Machine Learning tools our Data Scientists will use. It’s great to be at the leading edge of it as we explore the art of the possible, looking at data on the current areas of attention and improvement to bring into the platform and grow the enterprise analytical platform.
 

P2P Team: What are Jupiter’s advantages in its suite of tools that have made the data enablement effort possible?

DTO Team: For enterprise-level analytics, exposing your data to the enterprise catalog in Jupiter is very powerful because it enables widespread accessibility to data.
The specific tools we use include Qlik, which is very similar to Tableau in terms of visualization capabilities. Trifacta is available for data wrangling. Databricks also has data wrangling capabilities and a data lakehouse architecture for bringing in and storing raw data. Enterprises can then use the medallion process to differentiate raw data from transformed data: if the data is not properly transformed, efforts can bring it in at the Bronze level, then transform and clean it up to get it to Silver and Gold (Figure 1). Instead of the traditional extract-transform-load (ETL) process, the Databricks lakehouse architecture enables an extract, load, and then transform (ELT) process.

 

Figure 1: Databricks Medallion Process. https://databricks.com/notebooks/delta-lake-cdf.html
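The medallion flow described above can be sketched in plain Python. This is a conceptual illustration only, not the actual Databricks or Jupiter implementation; the record fields and values are made up for the example.

```python
# Conceptual sketch of the medallion (Bronze -> Silver -> Gold) flow.
# Field names and values are illustrative, not actual Navy/Jupiter data.

# Bronze: raw records loaded as-is (extract + load, no upfront transform)
bronze = [
    {"hull": " DDG-51 ", "readiness": "87", "date": "2022-01-10"},
    {"hull": "DDG-52", "readiness": None, "date": "2022-01-10"},
    {"hull": "ddg-51", "readiness": "90", "date": "2022-01-11"},
]

# Silver: the transform step -- clean, standardize, drop unusable rows
silver = [
    {"hull": r["hull"].strip().upper(),
     "readiness": int(r["readiness"]),
     "date": r["date"]}
    for r in bronze
    if r["readiness"] is not None
]

# Gold: aggregate into a consumable, analytics-ready view
grouped = {}
for r in silver:
    grouped.setdefault(r["hull"], []).append(r["readiness"])
gold = {hull: sum(vals) / len(vals) for hull, vals in grouped.items()}

print(gold)  # average readiness per hull
```

The key point of ELT is that the messy Bronze records land in the lakehouse first and are cleaned afterward, rather than being transformed before loading as in classic ETL.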

 
P2P Team: How is the DTO currently collaborating with the TYCOM/SYSCOMs and what is the DTO’s process?

DTO Team: Right now, we are collaborating with them in terms of the NIF objectives, working day-to-day with the NIF Action Officers. Our effort is very “P2P” in that it’s collaborative and data-driven. As we start filling out the wireframes and dashboards, there are a lot of meetings where SMEs are brought together, and everyone has a voice. Our team knows how to facilitate and keep track of things that are going on, but it is ultimately the SYSCOMs who have the expertise and data to make everything work.
 
We follow a two-part framework. For the first part, we focus on solving the right things, getting the right people in the room, and figuring out what we want to answer, so we can go down the right path initially. Once we have identified that path, we move to the second part and work with the Jupiter team on creating and bringing in a use-case. Now that we have an idea of what we want to do, let’s make sure we do it correctly and provide something that is of value to the enterprise. The stakeholders in TYCOMs and SYSCOMs are the key players in making sure we are answering the right questions and identifying the right data to bring in.

 
P2P Team: How can enterprises prepare for the transition to Jupiter?

DTO Team: The key things enterprises should think about are:
  • Start with a mission or business question
    • Use case
  • Identify those data elements that might be responsive to the question
    • What type of data
    • What the enterprise wants to do with the data
  • Identify the systems in which that data is housed
    • Where the data currently exists
Ultimately, Jupiter is not a magical share drive where you can dump data and then expect Jupiter to spit out results. It requires a use case (mission or business) to bring data in, so enterprises need to understand these aspects of their data before making the transition.

We also have to recognize that it is going to be complicated. Every organization and business unit will be in a different place with respect to data literacy, analytic acumen, and technology maturity. There will also have to be an allocation of resources to make sure the data is brought in the right way. Technology is great, but it comes down to people’s ability to come in and help us do this in a thoughtful and methodical way.


P2P Team: What are some common misconceptions of Jupiter?

DTO Team: If you rewind the clock a couple of years, people were reluctant to put information in Jupiter because they felt like they would lose control of the data, and that is not the case. Jupiter is a modern cloud-native application that unlocks a lot of potential if we can overcome the antiquated mindset of not sharing data in order to protect it. With cloud-native tools, we can extract, obscure, and tokenize data, enabling others to take advantage of aggregation without exposing the raw data to people who don’t need a certain level of detail or access. Data stewards get to determine how the information comes in, and they control the environment as far as access. It’s understandable to not want to give someone access to data and have them draw the wrong conclusions; we can deploy meta-tagging and create business rules to avoid that. Through analytics itself, we can control who views the data, and there are preventative measures built in to manage that data appropriately. Jupiter can bring data in and hold it for you based on how you think it should be viewed and used across the enterprise. It is also important to note that the goal is not to migrate the data, but rather to migrate where the enterprise accomplishes analytics. The data should remain in its respective authoritative environments but be exposed to Jupiter to help the enterprise analyze and visualize it.
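The tokenize-then-aggregate idea can be illustrated with a minimal sketch. This is hypothetical and not how Jupiter actually implements it; the field names, salt, and records are invented for the example. The point is that a deterministic, non-reversible token still supports grouping and counting without exposing the raw identifier.

```python
import hashlib

# Hypothetical sketch of tokenizing a sensitive identifier so analysts
# can aggregate records without seeing raw values. Field names and the
# salt are illustrative only.

SALT = "example-salt"  # in practice, a secret managed by the data steward

def tokenize(value: str) -> str:
    """Replace a raw identifier with a deterministic, non-reversible token."""
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:12]

raw_records = [
    {"ssn": "123-45-6789", "unit": "A"},
    {"ssn": "123-45-6789", "unit": "A"},
    {"ssn": "987-65-4321", "unit": "B"},
]

# Only tokenized records are exposed for enterprise analytics
tokenized = [{"person": tokenize(r["ssn"]), "unit": r["unit"]} for r in raw_records]

# Aggregation still works: count distinct people per unit
distinct = {}
for r in tokenized:
    distinct.setdefault(r["unit"], set()).add(r["person"])
counts = {unit: len(people) for unit, people in distinct.items()}

print(counts)  # distinct people per unit, raw SSNs never exposed
```

Because the same input always yields the same token, joins and counts remain correct, while the salted hash keeps the underlying identifier away from users who only need aggregates.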
 

P2P Team: Why can’t the Navy do a big data dump to make the transition to Jupiter faster?

DTO Team: If we tried to bring in all the data at once, it would be impossible to use. The Navy’s Chief Analytics Officer, Duncan McCaskill, is trying to prioritize data migration in a way that makes sense. We want to migrate over time to a modern data architecture that allows data to be liberated from the systems that create it, making that data available to any authorized user for any authorized purpose. There might only be a subset of a large dataset that is of value to leadership or across the enterprise; we want to prioritize bringing that into the Jupiter environment first.
The movement to fully leveraging Jupiter in the DON is a journey. We are at the early starting point for data, analysis, and digital modernization. Currently, we are doing static ingests, predominantly because most DON transactional systems do not support APIs at this time. So the process of realizing the full potential of this modern data architecture will take time, patience, and learning. As we bring in data and start building connections, we continue to learn more about the Jupiter platform and how the tools within it interact with the data, which allows us to get it right before scaling out. A giant data dump would take years to clean up, but now we can pick and choose what comes in as we need it, and then build upon lessons learned along the way.


P2P Team: How does the NIF data enablement effort complement P2P?

DTO Team: The way the data enablement effort has been framed by the CNO and VCNO, in how they want to progress and evolve it, is very P2P-centric. More specifically, we want to develop Driver Trees: we want to understand the performance drivers and the outcomes we’re driving toward so we can perform to a plan, which is the essence of P2P. That is the same model and methodology that we’re bringing into a large portion of elements within the NAVPLAN. Like P2P, the NIF data enablement effort is about understanding what is actually driving issues and problems, how to resolve them, and how to create plans and milestones to meet readiness goals.
 

P2P Team: How are the NIF dashboard visualizations structured to support decision making?

DTO Team: Right now, the DTO’s strategy is wireframing personas, building out the prototype, gathering feedback and iterating on that process. We are making visualizations that resonate with the decision maker. We can make a VCNO dashboard, a TYCOM dashboard, and a Program Manager dashboard, leveraging and pivoting off the same data. To create the different visualizations, we first try to build out personas, understanding what set of individuals would be interested in what information, and try to distinguish differences between them. We are not building a dashboard that tries to answer everything for everyone, because at that point it’s not answering anything for anybody. We want to tailor the visualizations and back the Driver Tree metrics with authoritative data so someone at the deckplate can use the dashboard for decision-making and identify a problem before it hits them.
 

P2P Team: We heard that you recently did a demo of the NIF R1 dashboard (Figure 2) for VCNO. What sort of feedback did you receive?

DTO Team: VCNO liked the construct and gave us some positive and constructive feedback on what he would like to see. We can do a lot in Qlik in terms of visualizations, but we would like to avoid doing a tremendous number of customizations because when software updates come out, sometimes those customizations can break. We try to do as much as we can with the out-of-the-box functionality so the dashboards can smoothly take the software updates as our analytics vendors continue to update the platform with new capabilities and patches. However, we can customize the interface based on who we’re presenting to. The dashboards are meant to be interactive and allow you to see the data in different ways. At leadership briefs, enterprises can present from the live dashboard and drill down on the data, which is very powerful and exactly where the VCNO wants the Navy to go.

 

Figure 2: NIF R1 Objective Dashboard


P2P Team: How is the DTO able to carry out this large-scale effort?

DTO Team: There are a lot of people and organizations who are supporting this effort. There are the Secretariat SMEs, TYCOMs, SYSCOMs, Fleet SMEs, Fleet Readiness Centers, CNA, and other folks who bring in models and support that enable our effort. The DON CIO, CAO, CDO, and the VCNO, through his Strategic Innovation Group (SIG), actively help us remove barriers. DNS has also added resourcing to help facilitate and bring the NIF action officers into the conversation. Further, we collaborate with DON CDO/CAO and PEO MLB to ensure that we stay aligned with and support their data architecture plans and visions.
 
What the DTO is doing is just one portion of the Navy’s larger data enablement effort, which exists to help us make decisions faster, fix small problems while they’re small, and simplify big problems. Big problems take up big resources. With data enablement, we can solve such problems at much lower levels as people become more informed. This is a very collaborative effort that brings together a lot of different SMEs and organizations to solve the right things, and solve them right.