Thursday, 7th of July, 2022.
1pm BST | 2pm EET | 3pm KSA
Helping your organization navigate unexpected events and turbulent business conditions can’t be done with static plans, inflexible reports, or obsolete data. The speed of change today is the biggest catalyst for FP&A teams to implement continuous, adaptive planning and forecasting and respond to rapid change.
In part one of this three-part webinar series, learn how FP&A teams are using Incorta to combine operational and financial data to improve the accuracy of scenario planning, quickly adjust forecasts based on the latest business trends, and expand planning agility company-wide to increase business resiliency.
1
00:00:07.200 --> 00:00:18.150
Hello everyone, and thank you for joining us today for part one of our three-part series on driving agility with financial analytics: building resilience with agile and adaptive continuous FP&A.
2
00:00:18.750 --> 00:00:26.880
Before we get started today, a few housekeeping items: if you do need to leave early, we will be making this webinar available on demand at incorta.com.
3
00:00:27.150 --> 00:00:37.680
You'll receive a link to access it within a couple of days. Should you have any questions, please feel free to type them into the chat; we will have a Q&A session towards the end of the webinar.
4
00:00:39.120 --> 00:00:52.290
My name is Ardeshir Ghanbarzadeh, and I'm Director of Product Marketing here at Incorta. Joining us today is Ryan Garrett, Senior Sales Engineer at Incorta. Ryan will be doing a demo of the analytics hub for finance later in the webinar.
5
00:00:54.120 --> 00:01:01.320
So, a little bit about what we'll be covering today: I will take a look at some of the challenges that FP&A teams are facing with analytics and reporting.
6
00:01:01.860 --> 00:01:11.460
We'll compare the modern and agile approaches to data architecture. Next, we'll take a look at the benefits of a unified approach to planning and operational data.
7
00:01:12.240 --> 00:01:20.520
Ryan will give a demo of Incorta's analytics hub for finance, and finally we will answer your questions in the Q&A session of the webinar.
8
00:01:22.560 --> 00:01:28.680
So we'll begin by taking a look at some of the key challenges facing FP&A teams and some of the things that we are seeing.
9
00:01:29.220 --> 00:01:38.910
In terms of trends around reporting requirements, there is a migration to real-time insights, and a demand from the business to drive
10
00:01:39.570 --> 00:01:48.480
data collection and reporting in that direction. There is a need to have insights now, on a daily or sometimes even an hourly basis.
11
00:01:49.080 --> 00:01:56.400
There is a desire to have answers to questions as soon as they are asked, within a very short window.
12
00:01:56.790 --> 00:02:06.360
This means that having data readily available and easily consumable is going to become critical to day-to-day operations and financial decision making.
13
00:02:07.290 --> 00:02:17.640
Another trend, and another challenge that we see, is visibility not only into top-line data, which is aggregations of data, but also into transaction-level detail.
14
00:02:18.360 --> 00:02:27.960
While oftentimes there is visibility into the top-line aggregations, there is frequently no availability of data to drill down into
15
00:02:29.280 --> 00:02:36.480
the lower-level, transaction-level details that provide the granularity needed to
16
00:02:37.080 --> 00:02:44.550
accurately identify whether aggregations are correct and whether adjustments need to be made,
17
00:02:44.820 --> 00:02:53.430
and oftentimes to find patterns or identify the root causes creating variances in forecasts, for example, especially in times of volatility.
18
00:02:53.970 --> 00:03:05.190
When there are variances, it becomes pretty important to have that level of granularity to identify the drivers behind those changes. This level of
19
00:03:05.670 --> 00:03:17.430
transactional detail becomes pretty critical, and it also has applications in advanced analytics use cases, such as using machine learning to generate forward-looking
20
00:03:17.790 --> 00:03:27.930
understanding with predictive or prescriptive analytics. Another key area of challenge is the velocity and volume of data. There are
21
00:03:28.560 --> 00:03:35.100
exploding volumes of operational data in business right now, and it's being collected, but not all of it is being leveraged
22
00:03:35.520 --> 00:03:40.950
to help produce rich analytics or do meaningful analysis to help with planning and forecasting.
23
00:03:41.670 --> 00:03:54.690
I'd also add that in most businesses there are multiple business applications, ERPs, and custom data sources that hold valuable data, but that data is not always accessible,
24
00:03:55.380 --> 00:04:08.400
and certainly not accessible at a granular level. So that's another challenge: the data is out there, but not always available for FP&A teams to leverage for decision making and insight generation.
25
00:04:09.090 --> 00:04:18.720
And finally, we often talk about a single source of truth. What this really means is building a common and trusted environment
26
00:04:19.230 --> 00:04:25.260
that you can leverage to generate insights and then drive incremental analysis across the business.
27
00:04:25.950 --> 00:04:35.850
So we want to bring data into a single place, so that cross-functional teams can work with finance teams, FP&A teams, and everyone in the ecosystem of finance,
28
00:04:36.150 --> 00:04:47.790
all using the same set of data for analysis, and then extend that beyond those functions and push it out to the rest of the organization,
29
00:04:48.330 --> 00:04:52.410
so that there is trusted accuracy behind
30
00:04:52.860 --> 00:05:08.520
what's happening with planning, what's happening with consolidation and reporting, and anything else done in the tools within the business. So these are some of the key challenges that exist today for FP&A teams and the ecosystem of the office of finance.
31
00:05:10.980 --> 00:05:28.140
One of the concepts out there today is the modern data architecture, but this is somewhat of a source of friction for the office of finance, because of the inflexibility and the limitations it puts around data
32
00:05:29.280 --> 00:05:37.290
and access to data. The way this process works is that the first step is moving data from the business sources that are out there, whether it be
33
00:05:38.160 --> 00:05:45.720
an ERP, an Oracle database, or any other business application or system that houses data that is necessary for
34
00:05:46.170 --> 00:05:55.770
making business decisions: extracting data from that source and putting it into a raw data zone. From that raw data zone, there is a
35
00:05:56.340 --> 00:06:03.120
significant amount of transformation that needs to happen to move this data into a refined data zone.
36
00:06:03.480 --> 00:06:10.680
This level of transformation actually starts to strip away some of the detail and fidelity of
37
00:06:11.190 --> 00:06:19.770
the data coming from the source. Furthermore, there is yet more reshaping and changing of this data
38
00:06:20.280 --> 00:06:27.840
done throughout this process to land it in a business data zone, where it can be used by some of the common tools and applications in the office of finance,
39
00:06:28.590 --> 00:06:39.420
including some of the data discovery tools, for example, before eventually reaching the consumer of the data, whether that be FP&A teams, AP teams,
40
00:06:39.990 --> 00:06:43.050
or teams involved in the close process.
41
00:06:43.500 --> 00:06:56.070
The disadvantage of this approach is that much of the transactional detail and the fidelity of the data is lost through this transformation process, so the very information that is needed to do things like root cause analysis
42
00:06:56.400 --> 00:07:04.410
or verifying the accuracy of aggregations is no longer available to the end user, who is simply stuck with those aggregations and the subset of data
43
00:07:04.890 --> 00:07:14.820
that was transformed from the original source and delivered to them. So how do you solve this problem of lost data fidelity and the lack of transactional detail
44
00:07:15.450 --> 00:07:30.450
needed to do root cause analysis, and the friction this creates when you're moving data from different raw sources, while still moving with speed and agility? Let's take a quick look at that.
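To make the fidelity loss concrete, here is a minimal sketch in plain Python. This is not Incorta or any vendor's code; the zone names, table, and field names are hypothetical. It shows how each aggregation stage of a traditional pipeline discards the transaction detail an analyst would later need for root cause analysis.

```python
# Hypothetical raw extract from a source ERP (illustrative data only).
raw_zone = [
    {"invoice_id": "INV-001", "account": "4000", "amount": 1200.0},
    {"invoice_id": "INV-002", "account": "4000", "amount": -300.0},
    {"invoice_id": "INV-003", "account": "4100", "amount": 950.0},
]

# Refined zone: transformations aggregate by account, dropping invoice IDs.
refined_zone = {}
for row in raw_zone:
    refined_zone[row["account"]] = refined_zone.get(row["account"], 0.0) + row["amount"]

# Business zone: a further rollup to a single top-line figure.
business_zone = sum(refined_zone.values())

print(business_zone)  # 1850.0 -- the only number the end user sees
# The -300.0 credit on INV-002 that would explain a variance is no longer
# recoverable from refined_zone or business_zone.
```

Once the pipeline has run, a user handed only `business_zone` (or even `refined_zone`) cannot drill back to the individual credit memo driving a variance; that is the loss of fidelity described above.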
45
00:07:32.970 --> 00:07:45.630
So Incorta actually takes a different approach here, by going directly to the source of the data in the business, whether that be an ERP, an operational management system, or another business application.
46
00:07:46.650 --> 00:08:01.020
We're able to take all of that data and combine it across multiple sources into a central hub, in a way that adds context and is able to
47
00:08:01.860 --> 00:08:10.050
enable end users to generate actionable insights. The difference in this case, though, is that there's not a lot of transformation being done here.
48
00:08:10.920 --> 00:08:26.160
Eliminating the need for the transformations that lose data fidelity and visibility into the transaction-level details means that you're making 100% of the data from the source available
49
00:08:26.940 --> 00:08:34.710
to end users, in a way that works with your existing tech stack. And once that data is on the platform,
50
00:08:35.310 --> 00:08:38.640
one of the things that we can do is use Incorta's blueprints
51
00:08:39.210 --> 00:08:51.210
to significantly accelerate implementing reports and dashboards, which can also work in concert, side by side, with the other reporting tools that you have today,
52
00:08:51.960 --> 00:09:01.380
as well as visualization tools such as Tableau, Excel, and Power BI, and even provide data to machine learning applications for training algorithms
53
00:09:01.980 --> 00:09:11.640
that can be used for predictive analytics. Ultimately, this is going to provide a lot more flexibility for FP&A teams and the entire ecosystem
54
00:09:12.240 --> 00:09:20.040
around the office of finance, for forecasting and planning, bringing some agility to the AP and AR processes
55
00:09:20.430 --> 00:09:29.880
to get better visibility into how to manage the dollars coming in and the dollars going out, and also helping the teams that support the close process shrink down
56
00:09:30.390 --> 00:09:41.100
that close cycle, making them more agile in delivering what they need to the business by having that visibility and access to all of the data.
57
00:09:44.730 --> 00:09:53.040
Now, taking a closer look at IT and finance teams, and the pain points that these two teams face
58
00:09:54.180 --> 00:09:59.490
and encounter on a daily basis when they're trying to accomplish what they need to enable the business:
59
00:09:59.790 --> 00:10:07.050
IT teams are trying to do one thing really well: build fast data pipelines, so that business users can get the data they want, when they want it.
60
00:10:07.680 --> 00:10:15.990
However, this is a pretty complex process. Where Incorta comes in to enable these teams is by simplifying the process
61
00:10:16.470 --> 00:10:22.620
of getting that data from the source and making it available to end users, so that they can generate insights and take action.
62
00:10:23.040 --> 00:10:35.220
Meanwhile, business users, who are often hampered by not having enough data or the data they need, are enabled by getting immediate access to that data
63
00:10:35.820 --> 00:10:48.990
at the detail level. They can then operate in those very short windows where they have to give the business quick answers to questions that are coming up, questions that are not repeats from the previous week, the previous month,
64
00:10:50.100 --> 00:10:59.160
or the previous quarter. The benefit here is that finance teams and business users, such as the FP&A teams, are saving time and becoming more efficient
65
00:10:59.850 --> 00:11:10.770
at the delivery and consumption of business data, and they are able to provide commentary to the business to drive decision making and move decisions forward.
66
00:11:13.020 --> 00:11:21.630
So with these different pain points in mind, the ones that the finance and IT teams and the ecosystem of the office of finance are facing,
67
00:11:22.200 --> 00:11:32.520
here are some of Incorta's unique value propositions that help strike a balance between the organizations and reduce that level of friction,
68
00:11:33.300 --> 00:11:42.870
bringing benefit to both. So how do we do that? We provide unrivaled data access by simplifying how data is sourced from multiple business systems
69
00:11:43.200 --> 00:11:54.120
and bringing 100% of that data to the business for analysis, enabling those business users to explore that data, ask questions,
70
00:11:54.930 --> 00:12:05.220
and determine, by drilling into the details, whether the data is accurate, and make adjustments to their forecasts or their
71
00:12:05.550 --> 00:12:20.040
scenario plans according to 100% of the business data, not just aggregations. We also provide an environment where there's a trusted, secure, and accurate level of
72
00:12:20.940 --> 00:12:27.120
source data available to both teams, giving the business user
73
00:12:27.480 --> 00:12:42.210
complete fidelity to access all the data, while at the same time giving the IT teams complete control over security and data governance, which is obviously really important to them, yet still enabling the business users to
74
00:12:43.860 --> 00:12:50.100
explore that data, generate insights, and drive decisions. And finally, one of the things that
75
00:12:50.580 --> 00:13:01.110
is key here is that eliminating the transformation and reshaping of data all the way from the source to the dashboard for reporting and analysis gives you
76
00:13:01.620 --> 00:13:10.110
far faster time to insight than other solutions, including traditional and legacy BI solutions.
77
00:13:13.200 --> 00:13:24.150
One of the customers who has had great success with the Incorta solution is a very commonly known name in the technology industry.
78
00:13:24.540 --> 00:13:31.650
This company is Broadcom; they design and manufacture semiconductors and solutions around semiconductors.
79
00:13:32.400 --> 00:13:41.010
By implementing Incorta, they were able to see significant improvements in how their business intelligence and reporting
80
00:13:41.610 --> 00:13:55.320
performed. They saw a 96% increase in the rows of available data, and they were able to access that data within four seconds, which is incredibly fast and hugely beneficial to their team for
81
00:13:55.890 --> 00:14:07.650
visibility into data and speed of decision making. They were also able to expand from only a handful of refreshes a day to up to 96 refreshes per day,
82
00:14:08.400 --> 00:14:20.160
ensuring that they have the latest data available for their end users to consume, analyze, plan with, and share across the organization.
83
00:14:21.000 --> 00:14:27.330
They also saw a significant 50% reduction in BI staffing,
84
00:14:28.050 --> 00:14:39.900
which obviously helps their bottom line. In addition, they were able to use this to de-risk the business and protect something close to $12 billion in market cap
85
00:14:40.650 --> 00:14:59.190
for Broadcom. So quite a lot of financial and operational benefits were realized by Broadcom as a result of this implementation; they have been an Incorta customer and have seen quite a lot of success over the years.
86
00:15:00.270 --> 00:15:02.370
With that, I wanted to
87
00:15:03.420 --> 00:15:13.440
hand things over to Ryan to give you a demo of Incorta, the analytics hub for finance. So Ryan, take it away.
88
00:15:16.020 --> 00:15:29.880
Thanks, Ardeshir. Sure. What I'll do is walk you through the Incorta platform. I'll connect to a data source; in this example we'll connect to an Oracle data source. Typically we see customers connecting to complex
89
00:15:31.320 --> 00:15:40.470
source systems such as Oracle, SAP, etc. We'll bring that data into Incorta and apply a blueprint, so you can see how Incorta can help you
90
00:15:40.950 --> 00:15:49.320
expedite the understanding of how those tables relate to each other, how they're joined, and how to actually get usable data out of the platform.
91
00:15:49.680 --> 00:15:57.180
Then we'll take a look at the semantic layer as well: how do we apply friendly names, and how do we get that data ready and
92
00:15:57.690 --> 00:16:04.170
consumable for the business? And then what I'll do is walk you through a scenario where we have
93
00:16:05.010 --> 00:16:08.430
high-level details
94
00:16:08.940 --> 00:16:16.170
for accounts receivable, and we can drill in there, to Ardeshir's point, down into transaction-level detail to
95
00:16:16.440 --> 00:16:24.540
really uncover what's going on and get those insights that we need as an FP&A team to make critical business decisions.
96
00:16:24.990 --> 00:16:30.630
So let me walk you through the platform now. The first thing we're going to do is connect to a data source.
97
00:16:31.170 --> 00:16:40.980
I'm going to connect to an Oracle data source, so in this case we'll set up the Oracle connection with a username and a password, test the connection, and make sure that we're connected.
98
00:16:41.640 --> 00:16:48.240
Once we're connected to that data, we can actually bring it into the platform; within Incorta we call that a schema.
99
00:16:48.840 --> 00:16:55.770
So you can bring that into a schema here, and what I'll do is take a look at this accounts receivable schema.
100
00:16:56.100 --> 00:17:02.610
As we bring that data into the platform, we can load it either as a full or an incremental load into Incorta.
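The full-versus-incremental distinction mentioned here can be sketched with generic watermark logic. To be clear, this is not Incorta's implementation; the row structure, field names, and watermark approach are illustrative assumptions about how such loads commonly work.

```python
def full_load(source_rows):
    """Full load: replace the target with a complete copy of the source."""
    return list(source_rows)

def incremental_load(target_rows, source_rows, watermark):
    """Incremental load: append only rows newer than the last high-water mark."""
    new_rows = [r for r in source_rows if r["updated"] > watermark]
    new_mark = max((r["updated"] for r in source_rows), default=watermark)
    return target_rows + new_rows, new_mark

# Hypothetical source rows with last-updated timestamps (ISO dates compare
# correctly as strings).
source = [
    {"id": 1, "updated": "2022-07-01"},
    {"id": 2, "updated": "2022-07-06"},
]

target, mark = incremental_load([], source, watermark="2022-07-05")
print(len(target), mark)  # 1 2022-07-06 -- only the changed row is loaded
```

A full load re-reads everything on every refresh; the incremental path is what makes the frequent refreshes discussed earlier (dozens per day) practical on large sources.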
101
00:17:03.150 --> 00:17:09.060
But one of the key things that Incorta provides, really to shorten that time to value
102
00:17:09.660 --> 00:17:23.610
for FP&A teams, is that we can apply blueprints to this data for known data sources: all of the different modules within Oracle, such as general ledger, accounts receivable, accounts payable, etc.
103
00:17:24.240 --> 00:17:37.620
Incorta has extensive blueprints for those, as well as many more, and we can bring that data in, apply those blueprints, and get data that is joined and combined
104
00:17:39.480 --> 00:17:46.530
across the different tables. So now we can take a look at this and say, okay, I've brought
105
00:17:47.130 --> 00:17:53.250
accounts receivable data in, and we've been able to join it with the dates, the transaction lines,
106
00:17:54.180 --> 00:18:04.860
the payment schedules, and the AR transactions. We've brought all of that data in, and we've created that metadata map and those joins within the Incorta platform.
107
00:18:05.520 --> 00:18:13.890
This dramatically reduces the time it takes for customers to get into their data and start making
108
00:18:14.760 --> 00:18:23.970
decisions based on what's in their data. What we hear most often is that people spend 80-plus percent of their time just trying to get data
109
00:18:24.210 --> 00:18:39.960
into a usable fashion: how do I join this to that, how do I extract data out of this system, how do I bring this over into this other tool? There's a lot of manual process involved in even getting an understanding of workable data sets.
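As one illustration of the manual work a blueprint is meant to remove, here is a sketch of hand-coding a single join between two ERP extracts in plain Python. The table and column names are hypothetical, loosely inspired by receivables data; real ERP schemas involve dozens of such joins with far messier keys.

```python
# Two hypothetical extracts: invoice transactions and their payment schedules.
transactions = [
    {"trx_id": 101, "customer": "Acme", "amount": 500.0},
    {"trx_id": 102, "customer": "Globex", "amount": 750.0},
]
payment_schedules = [
    {"trx_id": 101, "due_date": "2022-08-01", "amount_due": 500.0},
]

# A hand-rolled left join: keep every transaction, attach its schedule if any.
schedule_by_trx = {p["trx_id"]: p for p in payment_schedules}
joined = [
    {**t, "due_date": schedule_by_trx.get(t["trx_id"], {}).get("due_date")}
    for t in transactions
]

print(joined[1]["due_date"])  # None -- unmatched rows must be handled explicitly
```

Multiply this by every table pair in a module, plus key cleanup and null handling, and the "80-plus percent of their time" figure quoted above becomes easy to believe.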
110
00:18:41.100 --> 00:18:51.840
Now, from there, we take this a step further. Now that we've got your model within Incorta and we've joined all of that data, you can
111
00:18:52.260 --> 00:18:59.520
continue to join additional data as well; if you want to create your own joins, you can do that within a couple of clicks.
112
00:19:00.180 --> 00:19:12.240
And then the next piece is getting that data consumable and ready for the business. Within Incorta we call it a business schema, but think of that as your semantic layer.
113
00:19:12.690 --> 00:19:18.540
For this demo, we're going to take a look at an accounts receivable business schema that we've already created here.
114
00:19:18.930 --> 00:19:28.200
So now you can see the data in a much more business-consumable fashion; you can see the nice friendly names here.
115
00:19:28.650 --> 00:19:35.550
We can also see the source column that each field comes from, and this actually helps improve trust in your data.
116
00:19:36.180 --> 00:19:40.920
One of the things we hear a lot from customers is that they don't trust the data; they don't know where it comes from.
117
00:19:41.490 --> 00:19:49.620
In a lot of cases it's come through three, four, or five different tools before it's gotten into the hands of an FP&A professional.
118
00:19:50.250 --> 00:19:57.090
So here you can see very quickly where those fields have come from. Now, I can also go and add additional
119
00:19:57.750 --> 00:20:11.700
fields to the business schema here: I can add columns from other data sources, I can add fields, I can add calculated fields, I can bring all of that data into the schema.
120
00:20:12.060 --> 00:20:21.300
And with a couple of clicks I can make all the adjustments that I need. So if I wanted to say, this is my account type and I'm going to create a new account type name,
121
00:20:21.630 --> 00:20:31.020
I can do that within Incorta, or I can go create the formulas I need. So if I need, say, a quick four-month average formula,
122
00:20:31.590 --> 00:20:36.300
I can create a quick average formula and drag it onto the canvas, again without
123
00:20:36.810 --> 00:20:47.490
writing a whole bunch of code or relying on multiple teams, tools, and technologies. We're now getting this data to the point where it's ready and consumable for analysis, for visualizations,
124
00:20:48.060 --> 00:20:56.040
for predictive analytics, etc. Now when I go and take a look at this, I can explore the data within Incorta.
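The kind of calculated field just described, such as a four-month average, can be expressed in a few lines. This sketch uses generic Python, not Incorta's formula syntax, and the monthly figures are hypothetical.

```python
def trailing_average(values, window=4):
    """Average of the most recent `window` values, e.g. a 4-month average.
    If fewer values exist than the window, average what is available."""
    recent = values[-window:]
    return sum(recent) / len(recent)

# Hypothetical monthly receivables balances, oldest first.
monthly_receivables = [1200.0, 1350.0, 1100.0, 1500.0, 1450.0]

print(trailing_average(monthly_receivables))  # 1350.0
```

The point of defining this once in a semantic layer, rather than in each spreadsheet, is that every consumer of the schema then sees the same calculation.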
125
00:20:56.760 --> 00:21:05.160
I can explore that data, see all of those nice friendly names that are easy for me to consume, and start to build out
126
00:21:06.060 --> 00:21:16.170
the insights that I want to see. So in this example, let me bring in the organization ID and the quantity invoiced, if we're trying to see
127
00:21:17.010 --> 00:21:24.030
our accounts receivable. I can do that within Incorta, and you can see very quickly, within a couple of clicks, I've brought in
128
00:21:24.300 --> 00:21:37.290
these organizations and the amounts they've invoiced. So that's the first path: you can explore the data and extend your data curiosity very quickly, simply, and easily. The other path is that we can take
129
00:21:38.010 --> 00:21:50.880
pre-built content, pre-built reports, that map to these modules within Oracle to really drive efficiency and get you to insights a lot faster. So let me take a look at
130
00:21:51.300 --> 00:21:58.740
the content here. You can see there are some examples of pre-built content here, and if I go into the EBS side,
131
00:21:59.340 --> 00:22:08.190
you can see we have a lot of pre-built content. Now, the example we want to walk through here is:
132
00:22:08.670 --> 00:22:11.040
I'm a financial analyst, and I'm looking at
133
00:22:11.550 --> 00:22:23.700
what's going on with the quarter-end close, and I need to be able to figure out what the transactions are, how they were made, what's missing, etc. So let's walk through that cycle. When I look at this, I can go into
134
00:22:24.450 --> 00:22:30.780
a cash cycle summary and analysis; again, this is some of the out-of-the-box content that Incorta provides as part of our blueprint.
135
00:22:31.080 --> 00:22:36.150
You can see a lot of really great information there: receivables to revenue, inventory to revenue,
136
00:22:36.480 --> 00:22:43.680
etc., and you can see a lot of the trends that we're looking at as well. But if I go down here and take a look at our cash flow,
137
00:22:43.950 --> 00:22:50.610
one of the things I see is this change in receivables, and I look at that and say, okay,
138
00:22:51.150 --> 00:22:58.890
that's not what we're expecting. There's a transaction here that's not aligning, and we're not able to close the books until we can account for that.
139
00:22:59.280 --> 00:23:10.950
So now I can drill into this and go into how we're accounting for that accounts receivable. We can take a look at collections
140
00:23:11.430 --> 00:23:15.120
here as well: the outstanding balances, what's overdue, all that good stuff.
141
00:23:15.540 --> 00:23:24.240
When I drill into this and take a look, I can see our top customers and what's going on, but I also see some anomalous transactions in here.
142
00:23:24.960 --> 00:23:33.210
When I look at these, they stand out for me. So I'm looking at this and saying, okay, we've done business with Amazon, and for whatever reason
143
00:23:33.990 --> 00:23:42.420
we're not getting the correct balance here, so we want to drill into that a little further. Now, in the traditional sense,
144
00:23:42.780 --> 00:23:48.630
in a lot of cases this means extracts and manual processes and dumping data into Excel, trying to get
145
00:23:49.260 --> 00:23:53.460
data from multiple different tools just to get that transaction-level detail.
146
00:23:53.790 --> 00:24:00.960
Within Incorta, this becomes a very simple process, because we can drill straight into the transaction-level detail.
147
00:24:01.230 --> 00:24:05.940
I'll go over here to the accounts receivable transaction-level detail to take a look at this.
148
00:24:06.270 --> 00:24:16.650
So now we can look across our entire environment, across all of the transactions that we've made, and we're looking at almost half a billion records' worth of data.
149
00:24:16.980 --> 00:24:20.850
So when I look at that, I can drill into this, and we remember from the previous view
150
00:24:21.660 --> 00:24:30.300
we were looking at Amazon. I can now drill into all of the transactions that we've done with Amazon, and you can see we've filtered down from
151
00:24:30.630 --> 00:24:41.130
half a billion records to just under 404,000 records in a matter of a couple of seconds. So now we can take a look at the transaction-level detail
152
00:24:41.610 --> 00:24:49.980
from those high-level summary metrics. If you remember, we started with the accounts receivable, went to the collections,
153
00:24:50.340 --> 00:25:06.450
and then drilled all the way through to the transaction-level detail. We've done that all in a single platform; we've done it without writing extensive code, in a point-and-click, no-code/low-code environment; and we've done it very quickly and efficiently,
154
00:25:07.620 --> 00:25:21.030
within a single platform. This is really the value that we see Incorta providing our customers: being able to provide that finance and data analytics hub for FP&A professionals.
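The drill-down path just demonstrated, from a summary metric down to one customer's transactions, reduces to a simple filter over the full transaction set. This sketch uses plain Python with hypothetical data; in a real system the filter is pushed down to an in-memory engine rather than evaluated row by row in application code.

```python
# Hypothetical receivables transactions (the demo's real set held ~500M rows).
transactions = [
    {"customer": "Amazon", "amount": 120.0},
    {"customer": "Amazon", "amount": -45.0},
    {"customer": "Acme",   "amount": 300.0},
]

# Summary level: total receivables across all customers.
total = sum(t["amount"] for t in transactions)

# Drill-down: the same underlying rows, filtered to the customer under
# investigation -- no separate extract or second tool required.
amazon_detail = [t for t in transactions if t["customer"] == "Amazon"]

print(total, len(amazon_detail))  # 375.0 2
```

The key property is that the summary and the detail are computed from one data set, so a suspicious aggregate (here, the -45.0 credit hiding inside Amazon's total) can always be traced back to its source rows.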
155
00:25:21.600 --> 00:25:34.140
So let me just recap what we've done: we connected to a data source, we brought the data in, and we applied the blueprints to easily combine and join that data.
156
00:25:34.980 --> 00:25:51.870
We made that data available in a semantic layer, and we also made it available for visualization. Then, from the visualization, we went all the way from top-line revenue through to transaction-level detail, within a few clicks, on a single platform.
157
00:25:52.950 --> 00:25:56.550
And that is Incorta's approach to a finance analytics hub.
158
00:25:57.840 --> 00:25:58.920
Ardeshir, back to you.
159
00:26:07.200 --> 00:26:21.360
All right, Ryan, thank you so much for that great demo, much appreciated. So folks, just to wrap up, the takeaways from this webinar today, the three things that are really important when you are working
160
00:26:22.650 --> 00:26:28.470
with data for analytics, for financial planning and analysis, and for reporting, are:
161
00:26:28.950 --> 00:26:41.550
One, having access to all the available data, and having it in an accessible fashion, is going to be critical to developing accurate forecasts and being able to plan for the organization in the short and long term.
162
00:26:42.690 --> 00:26:48.510
Two, we need to go beyond aggregations and have visibility, like Ryan showed in the demo,
163
00:26:49.230 --> 00:26:58.140
into detailed transaction data, being able to drill down to the granular level to make sure that the aggregations are accurate, or to
164
00:26:58.560 --> 00:27:06.510
study the data to understand root causes and identify problem areas. And finally, with current
165
00:27:06.840 --> 00:27:19.770
and accurate data in hand, you can generate insights that are going to help your organization make better decisions and navigate what we have today, which are quite volatile business and economic conditions.
166
00:27:20.670 --> 00:27:32.670
With that said, we'd love to take some of your questions, so if you have a question, please type it into the chat, and we'll do our best to answer it for you.
167
00:27:35.790 --> 00:27:48.480
Ryan, a question for you: we alluded to blueprints in the presentation, in the part where I was talking about them, and you mentioned the out-of-the-box blueprints that we have in the Incorta platform. Can you explain a little bit about what they are?
168
00:27:49.710 --> 00:27:56.580
Yeah, absolutely. Blueprints are Incorta's approach to pre-packaged analytics applications.
169
00:27:57.090 --> 00:28:09.810
What this means is we're spending our engineering effort to go and understand those really complex source systems, the Oracles of the world, the SAP systems, and being able to
170
00:28:10.620 --> 00:28:17.340
figure out what that known good state is: how those tables relate to each other, how they combine with each other,
171
00:28:18.420 --> 00:28:20.910
and be able to apply that into a packaged,
172
00:28:22.020 --> 00:28:23.490
easy-to-deploy
173
00:28:24.810 --> 00:28:28.320
analytics application. So we bring that data into Incorta.
174
00:28:29.940 --> 00:28:41.250
The complex joins and all that good stuff come in at the schema layer, as I showed you, as well as the business layer:
175
00:28:41.820 --> 00:28:54.060
friendly names, usable columns, as well as the visualizations. So we take all of that and combine it into a single application that our customers can deploy very easily.
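The two layers Ryan describes can be sketched as data plus a small deploy step. Everything here is invented for illustration (the join definition and friendly-name mapping are toy stand-ins, not Incorta's actual blueprint format): a "blueprint" bundles the prepackaged join logic for raw ERP-style tables together with a business layer that relabels cryptic columns into user-friendly names.

```python
import pandas as pd

# A toy "blueprint": packaged join logic plus a business layer of
# friendly column names. Table and column names are hypothetical.
BLUEPRINT = {
    "joins": [("RA_CUSTOMER_TRX_ALL", "HZ_PARTIES", "party_id")],
    "business_layer": {
        "trx_number": "Invoice Number",
        "party_name": "Customer",
        "amount_due": "Amount Due",
    },
}

# Raw ERP-like tables, as they might be extracted from the source system
invoices = pd.DataFrame({"trx_number": ["INV-1", "INV-2"],
                         "party_id": [1, 2],
                         "amount_due": [500.0, 750.0]})
parties = pd.DataFrame({"party_id": [1, 2],
                        "party_name": ["Acme", "Globex"]})

def deploy(raw_tables, blueprint):
    """Apply the packaged joins, then relabel columns for business users."""
    left, right, key = blueprint["joins"][0]
    joined = raw_tables[left].merge(raw_tables[right], on=key)
    return joined.rename(columns=blueprint["business_layer"])

report = deploy({"RA_CUSTOMER_TRX_ALL": invoices, "HZ_PARTIES": parties},
                BLUEPRINT)
```

The point of packaging both layers together is that the hard part, knowing which tables join on which keys in a complex ERP, is done once by the vendor and reused by every customer, which is where the months-to-weeks deployment savings described next come from.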
176
00:28:54.450 --> 00:29:02.700
And what we're seeing with customers is this helps take many months out of the overall analytics journey. We have one of the large
177
00:29:03.600 --> 00:29:10.830
coffee retailers that's an Incorta customer, and they've been working with Incorta now for several years.
178
00:29:11.160 --> 00:29:19.620
When they originally thought about modernizing their data architecture, they had planned a 12- to 18-month process.
179
00:29:19.890 --> 00:29:29.250
With Incorta, leveraging our Blueprints and the Incorta technology, they were able to take that down from 12 to 18 months to 10 weeks.
180
00:29:29.640 --> 00:29:42.810
So think about those Blueprints as really that shortcut to getting data out of a complex system and into the hands of your FP&A people for rapid analysis and rapid insight.
181
00:29:44.670 --> 00:29:53.220
Awesome, thank you for that explanation. Another question here is more of a process question: how do we compel IT teams
182
00:29:53.910 --> 00:30:06.420
to spend more time gathering data for FP&A? So this is the area where Incorta creates that balance between IT and finance, where
183
00:30:07.140 --> 00:30:17.250
we are able to enable the IT teams to collect and extract data from the different sources within the business,
184
00:30:17.700 --> 00:30:22.050
bring it all into the analytics hub, and centralize it in a common data environment,
185
00:30:22.710 --> 00:30:32.430
and give access to the business users in that single data hub, and to do that at a granular level without the transformation and reshaping of data.
186
00:30:33.090 --> 00:30:42.660
So this makes the process a lot easier for them. It brings the users to the data and empowers the users to actually make changes to
187
00:30:43.260 --> 00:30:49.020
the reports that they want without having to go through the IT queue to make changes, for example, to add a column
188
00:30:49.530 --> 00:30:54.180
or make a change to how a report is updated or delivered.
189
00:30:54.840 --> 00:31:02.580
Empowering the end users, and enabling IT to deliver all of the data to them, is how
190
00:31:02.910 --> 00:31:12.180
Incorta is able to create that right balance between the needs of the finance teams and the complexities that IT has to go through
191
00:31:12.630 --> 00:31:26.640
in order to collect data from different business sources, whether ERPs or other business applications, and centralize and secure that in a single location so that the business users can have immediate access to it.
192
00:31:33.780 --> 00:31:43.920
Ardeshir Ghanbarzadeh: All right, folks, I just wanted to answer one more question here. There was a question along the lines of SOX compliance
193
00:31:44.610 --> 00:31:49.530
Ardeshir Ghanbarzadeh: and using Oracle as a data source. One thing I just wanted to reinforce is the fact that
194
00:31:50.190 --> 00:31:57.450
Ardeshir Ghanbarzadeh: with Incorta you're able to pull in data from multiple sources, so bringing in data from your ERPs and other
195
00:31:58.110 --> 00:32:03.570
Ardeshir Ghanbarzadeh: operational systems is something that you can do with that finance data hub.
196
00:32:04.470 --> 00:32:13.710
Ardeshir Ghanbarzadeh: Then, on the question around SOX compliance, I'll have to check with the team to see whether Incorta has been implemented in a SOX-compliant environment.
197
00:32:14.310 --> 00:32:23.910
Ardeshir Ghanbarzadeh: But I can tell you that the fact that Incorta allows you to bring 100% of your ERP and operational data into
198
00:32:24.420 --> 00:32:31.650
Ardeshir Ghanbarzadeh: the analytics data hub for finance provides the end users the ability to drill down
199
00:32:32.310 --> 00:32:40.800
Ardeshir Ghanbarzadeh: to the transaction-level details. That essentially is the way to support the audits and verifications of accuracy
200
00:32:41.460 --> 00:32:46.530
Ardeshir Ghanbarzadeh: that are needed when it comes to drilling into aggregated data to make sure that the underlying
201
00:32:47.250 --> 00:32:59.610
Ardeshir Ghanbarzadeh: accounts are accurate in terms of the rolling up of the data. So we'll get back to you on the implementation around SOX compliance, but I can tell you that
202
00:33:00.210 --> 00:33:10.950
Ardeshir Ghanbarzadeh: the fact that Incorta brings 100% of the data in provides that level of granular visibility that most auditors are looking for when it comes to ensuring accuracy and compliance.
203
00:33:12.510 --> 00:33:32.790
Ardeshir Ghanbarzadeh: That is all the questions we have today, folks. Be sure to join us on the second of August at 12pm UTC for the second part of our three-part series on driving agility with operational and financial analytics. Thank you again for being here today, and have a great rest of the day.