Complete Intelligence


Startup makes superforecasting possible with AI

This article was originally published at https://blogs.oracle.com/startup/startup-makes-superforecasting-possible-with-ai on December 1, 2020.

 

 

Here’s a mathematical problem: The sum of all the individual country GDPs never equals the global GDP. That means forecasting models are flawed from the start, and it’s impacting global supply chain economics in a big way. Entrepreneur Tony Nash found that unacceptable, so he built an AI platform to help businesses “understand the sum of everything” through a highly automated, globally data-intensive solution with zero human bias.

 

Complete Intelligence, Nash’s Houston-based startup, uses global market data and artificial intelligence to help organizations visualize financial data, make predictions, and adjust plans in the context of the global economy, all on the fly. The globally integrated, cloud-based AI platform helps purchasing, supply chain planning, and revenue teams make smarter cost and revenue decisions.

 

“The machines are learning, and many times that has meant deviating from traditionally held consensus beliefs and causality models,” said Nash. “Causal beliefs don’t hold up most of the time—it’s human bias that is holding them up—our AI data is reducing errors and getting closer to the truth, closer to the promise of superforecasting.”

 

 

Massive datasets across 1,400 industry sectors

More than 15 billion data points run through the Complete Intelligence platform daily, driving hundreds of millions of calculations. Typical forecasting software models use 10 to 12 sector variables. Complete Intelligence, on the other hand, examines variables across 1,400 industry sectors. That robustness gives businesses insight and control they didn’t have before.

 

“We’ve seen a big shift in how category managers and planning managers are looking at their supply chains,” said Nash. “Companies are taking a closer look at the concentration of supply chains by every variable. Our platform helps companies easily visualize the outlook for their supply chain costs, and helps them pivot quickly.”

 

 

Superforecasting brings a modern mindset to an old industry

 

Australia-based OZ Minerals, a publicly traded modern mining company focused on copper, operates mines in Australia and Brazil. OZ says its modern mantra is more than technology; it’s also a mindset: test, learn, innovate. The company wanted to better navigate and understand the multi-faceted copper market, where the connectivity between miner, smelter, product maker, and consumer is incredibly complex and dynamic. It turned to Complete Intelligence.

 

“I need a firm understanding of both fiscal and monetary policies and foreign exchange rates to understand how commodity prices might react in the future because a depreciating and/or appreciating currency can impact the trade flows, and often very quickly, which might influence decisions we make,” said Luke McFadyen, Manager of Strategy and Economics at OZ Minerals.

 

“Our copper concentrate produced in Australia and Brazil may end up being refined locally or overseas. And then it is turned into a metal, which then may be turned into a wire or rod, and then used in an electric vehicle sold in New York, an air conditioner sold in Johannesburg, or used in the motor of a wind turbine in Denmark,” he explains. “The copper market is an incredibly complex system.”

 

With Complete Intelligence, McFadyen has a new opportunity to test for a bigger-picture understanding and responsiveness. Previously, he updated his models every few months. Now he can do it every 47 minutes if he needs to.

 

McFadyen points to the impact of COVID-19 as a “Black Swan” event that no forecasting software could have predicted, but which is nonetheless impacting currencies, foreign exchange, and cost curves throughout the global copper market and its supply chains.

 

“If your model isn’t dynamic and responsive in events like we are experiencing today, then it is not insightful. If it’s not insightful, it’s not influencing and informing decisions,” he said. “Complete Intelligence provides a different insight compared to how the traditional price and foreign exchange models work.”

 

McFadyen says early results have reflected reductions in error rates and improved responsiveness.

 

 

Cloud power and partnership

 

Complete Intelligence needed a strong technology partner but also one with global expertise in enterprise sales and marketing that could help boost their business. They found it with Oracle for Startups.

 

“We have lots of concurrent and parallel processes with very large data volumes,” said Nash. “We are checking historical data against thousands of variables, running anomaly detection, massive calculation processing, and storage. And it’s all optimized with Oracle Cloud.”

 

Nash, who migrated off Google Cloud, says Oracle Cloud gives him the confidence that his solution can handle these workloads and data sets without downtime or performance lapses. The partnership also gives him a credible technology that is native to many clients.

 

“As we have potential clients that come to us that are using Oracle, having our software on Oracle Cloud infrastructure will make it easier for us to deploy and scale. A seamless client experience is a critical success factor for us.”

 

Nash says the Oracle startup program’s free cloud credits and 70% discount have allowed them to save costs while increasing value to customers. He also takes advantage of the program’s resources, including introductions to customers and marketing and PR support.

 

“We’ve been impressed by the resources and dedication of the Oracle for Startups team,” he said. “I’d recommend it, especially for AI and data startups ready for global scale.”

 

 

Beyond mining: superforecasting futures with AI

 

Beyond mining, Complete Intelligence is working with customers in oil and gas, chemicals, electronics, food and beverages, and industrial manufacturing. From packaging to polymers and sugar to sensors, these customers use Complete Intelligence for cost and revenue planning, proactive purchasing and supply chain planning, risk management, and auditing, as well as general market and economic forecasts.

 

Complete Intelligence forecasts for energy and industrial metals were 9.4% more accurate than consensus forecasts over the same period, and Complete Intelligence continues to add methods to better account for market shocks and volatility.

 

OZ Minerals’ McFadyen said, “This is the next step in how economists can work in the future with change leading towards better forecasts, which will inform better decisions.”

 

Nash and Complete Intelligence are betting on it – and building for the future.


Stories from the Cloud: The Forecast Calls For…

Tony Nash joins veteran journalists Michael Hickins and Barbara Darrow on the Stories from the Cloud podcast to talk about what the forecast calls for in business, and how AI and machine learning can improve budget forecasting. How does his company, Complete Intelligence, dramatically improve forecast accuracy for companies suffering from error rates as high as 30%? He also explains the AI technology behind Complete Intelligence’s solutions and how it applies in practice to global companies, helping them plan budgets more accurately and reduce cost risk.

 

Stories from the Cloud description:

It’s not easy to predict the future. But when it comes to business and financial forecasting, the right data and the right models are better than any crystal ball.

 

Tony Nash, CEO and founder of Complete Intelligence, explains how AI and the cloud are giving companies better forecasting tools to see into their financial futures.

 

About Stories from the Cloud: Enterprises worldwide are turning to the cloud to help them thrive in an ever-more-competitive environment. In this podcast, veteran journalists Michael Hickins and Barbara Darrow chat with the people behind this massive digital transformation and the effects it has on their work and lives.

 

Show Notes

 

SFC: Hey, everybody, welcome back to Stories from the Cloud sponsored by Oracle. This week, I am here, as always, with Michael Hickins, formerly of The Wall Street Journal. I am Barbara Darrow. And our special guest today is Tony Nash. He’s the founder and CEO of Complete Intelligence. And this is a very interesting company. Tony, thanks for joining us. And can you just tell us a little bit about what the problem is that Complete Intelligence is attacking and who are your typical customers?

 

Tony: Sure. The problem we’re attacking is just really bad forecasting, really bad budget setting, really bad expectation setting within an enterprise environment. Companies have packed away data for the last 15, 20 years, but they’re not really using it effectively. We help people get very precise, very accurate views on costs and revenues over the next 12 to 24 months so they can plan more precisely and tactically.

 

SFC: It sounds like a big part of the mission here is to clean up… Everybody talks about how great data is and how valuable it is. But I mean, it sounds like there’s a big problem with a lot of people’s data. And I’m wondering if you could give us an example of a company, let’s just say a car maker and what you can help them do in terms of tracking their past costs and forecasting their future costs.

 

Tony: So a lot of the problem that we see, let’s say, with the big auto manufacturer, is they have long-term supply relationships where prices are set, or they’ve had the same vendor for X number of years and they really don’t know if they’re getting a market cost, or they don’t have visibility into what are those upstream costs from that vendor. And so, we take data directly from their ERP system or their supply chain system or e-procurement system and we come up with very specific cost outlooks.

 

We do the same on the sales revenue side. But say for an automaker, a very specific cost outlook for the components and the elements that make up specific products. So we’ll do a bill of material level forecast for people so that they can understand where the cost for that specific product is going.

 

Before I started Complete Intelligence, I ran research for a company called The Economist and I ran Asia consulting for a company called IHS Markit. And my clients at both companies would come to me with two issues. The first is that the forecasting tools people buy off the shelf have a high error rate. The second is that the forecasts don’t have the level of context and specificity needed for people to actually make decisions. So what do you get? You get very generic data with imprecise forecasts coming in, and then you get people building spreadsheets and bespoke models within different departments and teams across a company.

 

So there are very inconsistent ways of looking at the world. And so we provide people with a very consistent way and a very low error way of looking at the future trajectory of those costs and of those revenues.

 

SFC: So I’m curious, what is the psychology of better business forecasting for your customers? I’m thinking, if I’m a consumer, so this is maybe not a good analogy, but if I look at what the actual component costs are of the phone in my pocket, I may think, jeez, why am I paying a thousand dollars for this? But then part of me says, there’s markup along the way, and I guess I accept that. I don’t need software to tell me that the components of the pocket computer don’t add up to what I paid for it; I kind of understand that there needs to be money made along the way. How does that translate in a B2B context? What is people’s attitude about price, and how do they react to data that, as you said, has heretofore been kind of unreliable? I think you said a lot of procurement projections have been off by around 30 percent, which is huge. So how does that happen, and how do people react to something that seems more trustworthy?

 

Tony: Well, I think that expectations depend on the level within a manufacturing company that you’re talking to. I think the more senior level somebody is, of course, they want predictability and quality within their supply chain, but they’re also responsible to investors and clients for both quality and cost. And so at a senior level, they would love to be able to take a very data driven approach to what’s going on. The lower you get within a manufacturing organization, this is where some of the softer factors start to come in. It’s also where a lot of the questionable models are put in as well.

 

Very few companies that we talk to actually monitor their internal error rates for their cost and revenue outlooks. They’ll have a cost forecasting model or a revenue forecasting model that they rely on because they’ve used it for a long period of time, but they rarely, if ever, go back and look at the error rates that model puts out. Because what’s happening is they’re manually adjusting data along the way. They’re not really looking at the model output except for that one time of year when they’re doing their budget.

 

So there really isn’t accountability for the fairly rudimentary models that manufacturing companies are using today. What we do is we tell on ourselves. We give our clients our error rates every month because we know that no forecasting model is perfect. We want our clients to know what the error rate is so that they can account for it within their decision-making processes.

 

SFC: And it’s kind of like a margin of error in a political poll?

 

Tony: Yeah, we use what’s called MAPE – mean absolute percent error. With most error calculations, you can game the pluses and minuses. So let’s say you were 10 percent over last month and 12 percent under this month. If you average those out, that’s one percent error. But if you look at it on an absolute percent error basis, that’s 11 percent error. So we gauge our error on an absolute percent error basis, because it doesn’t matter whether you’re over or under; it’s still error.
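To make that arithmetic concrete, here is a minimal sketch of the distinction Nash draws, using the numbers from his example (illustrative only, not Complete Intelligence’s code):

```python
def mean_error(pct_errors):
    """Signed average: over- and under-forecasts cancel each other out."""
    return sum(pct_errors) / len(pct_errors)

def mape(pct_errors):
    """Mean absolute percent error: every miss counts, regardless of sign."""
    return sum(abs(e) for e in pct_errors) / len(pct_errors)

# The example from the interview: 10 percent over one month, 12 percent under the next.
errors = [10.0, -12.0]
print(mean_error(errors))  # -1.0 -> "averages out" to just one percent error
print(mape(errors))        # 11.0 -> the honest 11 percent error
```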

 

SFC: Still wrong, right.

 

Tony: Yeah. So we tell on ourselves to our clients, because we’re accountable. We need to model the same behavior that those senior executives have with their investors and with their customers, right? An investment banking analyst doesn’t really care whether it was a plus or a minus. They just care that it was wrong. And they’re going to hold that company accountable, and they’re going to punish it in public markets.

 

So we want to give those executives much better data to make more precise decisions with lower error rates, so they can get their budgeting right, so they can have the right cash set aside to do their transactions through the year, and so they can work with demand plans and put our costs against, say, their volume demand plans, those sorts of things.

 

SFC: I have to just ask, I mean, Michael alluded to this earlier, but I want to dive into it a little more. You had said somewhere else that most companies’ procurement projections are off by 30 percent. That’s a lot. I mean, I know people aren’t… I mean, how is that even possible?

 

Tony: So first, I need to be clear that that’s not a number we’ve come up with, and it’s not a number that’s published anywhere. It’s a number that we consistently get as feedback from clients and from companies that we’re pitching. So that 30 percent is not our number. It’s a number that we’re told on a regular basis.

 

SFC: When you start pitching a client, obviously there’s a period where they’re just sort of doing a proof of concept. How long does that typically last before they go, you know what, this is really accurate, this can really help, let’s go ahead and put this into production?

 

Tony: Well, I think typically, when we hit the right person, somebody who’s involved in, let’s say, category management, or who actually owns a P&L, or who is senior on the FP&A side or in digital transformation, those guys tend to get it pretty quickly, actually. And they realize there’s really not stuff out there similar to what we’re doing. But for people who need to observe it first, it probably takes three months. So our pilots typically last three months.

 

And after three months, people see side by side how we’re performing, and they’re usually convinced, partly because of the specificity of the projection data that we can bring to the table. Whereas within companies they may be doing, say, a higher-level look at things, we’re doing very much a bottom-up assessment of where costs will go. From a very technical perspective, the databases we’re using are structured in a way that those costs add up.

 

And we forecast at the outermost leaf node of, say, a bill of materials. A bill of materials may have five or 10 or 50 levels, but we go out to the outermost item within it, and then we add those up as the components and items stack up within that bill of materials. Let’s say it’s a mobile phone: you’ll have a screen, you’ll have internal components, you’ll have the case on the outside. All of those things are subcomponents of the bill of materials for that mobile phone.
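As a rough illustration of that roll-up, here is a hedged sketch of forecasting at the leaf nodes of a bill of materials and summing upward. The structure mirrors the mobile-phone example above, but the component names and costs are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class BomNode:
    """One level of a bill of materials; only leaves carry their own forecast."""
    name: str
    forecast_cost: float = 0.0                    # used only at leaf nodes
    children: list["BomNode"] = field(default_factory=list)

    def rolled_up_cost(self) -> float:
        if not self.children:                     # outermost leaf: its own forecast
            return self.forecast_cost
        return sum(c.rolled_up_cost() for c in self.children)

# Hypothetical phone BOM: the costs are made up; the roll-up logic is the point.
phone = BomNode("phone", children=[
    BomNode("screen", forecast_cost=38.0),
    BomNode("case", forecast_cost=12.0),
    BomNode("internals", children=[
        BomNode("processor", forecast_cost=55.0),
        BomNode("memory", forecast_cost=21.0),
    ]),
])
print(phone.rolled_up_cost())                     # 126.0
```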

 

SFC: So I am assuming that there is a big role here in what you’re doing with artificial intelligence, machine learning. But before we ask what that role is, can you talk about what you mean by those terms? Because we get a lot of different definitions and also differentiations between the two. So maybe talk to the normals here.

 

Tony: OK, so I hear a number of people talk about A.I. and they assume that it’s this thinking machine that does everything on its own and doesn’t need any human interaction. That stuff doesn’t exist. That’s called artificial general intelligence. That does not exist today.

 

It was explained to me a few years ago, and this is probably a bit broader than most people are used to, but artificial intelligence from a very broad technical perspective includes everything from a basic mathematical function on upward. When we get into the machine learning aspect of it, that is automated calculation: a machine recognizes patterns over time, builds awareness based on those previous patterns, and applies them to current or future activity.

 

So when we talk about A.I., we’re talking about learning from previous behavior, and, this is a key thing to understand, we have zero human intervention in our process. Of course, people are involved in the initial programming, that sort of thing. But let’s say we have a platinum forecast that goes into some component we’re forecasting for somebody. We’re not looking at the output of that forecast and going, “Hmmm, that doesn’t really look right to me, so I need to fiddle with it a little bit to make sure it kind of looks right to me.” We don’t do that.

 

We don’t have a room of people sitting somewhere in the Midwest or South Asia or wherever who manually manipulate stuff at all — from the time we download data, validate data, look for anomalies, process, and forecast, through to upload, that entire process for us is automated.
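Conceptually, that hands-off flow looks something like the sketch below. Every function body here is a trivial placeholder rather than Complete Intelligence’s actual code; the point is the shape of the pipeline, in which no stage waits for a human to adjust the output:

```python
def download(sources):      return [s["history"] for s in sources]   # pull raw series
def validate(series):       return [s for s in series if s]          # drop empty series
def flag_anomalies(series): return series                            # placeholder check
def fit(series):            return lambda: [s[-1] for s in series]   # naive last-value model
def upload(forecasts):      print("published:", forecasts)           # push to platform

def run_pipeline(sources):
    data = flag_anomalies(validate(download(sources)))
    forecasts = fit(data)()      # model output, published without manual edits
    upload(forecasts)
    return forecasts

run_pipeline([{"history": [1.00, 1.02, 1.05]}, {"history": [40.0, 41.5]}])
```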

 

When I started the company, what I told the team was: I don’t want people changing the forecast output. Because if we sit and talk to a client and say, hey, we have a forecast model, but then we go in and change it manually, we’re effectively lying to our customers. We’re saying we have a model, but then we’re just changing it on our own.

 

We want true fidelity to what we’re doing. If we tell people we have an automated process, if we tell people we have a model, we really want the output to be model output, without people getting involved.

 

So the machines have brought out a number of unconventional calls that went pretty far against consensus, calls we wouldn’t necessarily have made on our own. And to be very honest, some of them were a little bit embarrassing when we put them out, but they ended up being right.

 

If you look at, say, January 2019, the US dollar was supposed to continue to depreciate through the rest of the year. This was the consensus view of every currency forecaster out there. And I was speaking on one of the global finance TV stations telling them about our dollar outlook.

 

And I said, “Look, our view is that the dollar will stabilize in April, appreciate in May and accelerate in June.” And a global currency strategist literally laughed at me during that interview and said there’s no way that’s going to happen. In fact, that’s exactly what happened. Just sticking with currencies, and for people in manufacturing, we said that the Chinese yuan, the CNY, the renminbi, would break seven in July of 2019. I’m sure your listeners don’t necessarily pay attention to currency markets, but it actually did break seven in early August. So that was a very big non-consensus call that we got months and months ahead of time, and it consistently bore out within our forecast iterations after that. We do the same in, say, metals with things like copper, or in soy on the ag side.

 

On a monthly basis, on our base platform, we’re forecasting about 800 different items, so people can subscribe just to our data subscription. If they want to look at ag commodities, metals, precious metals, equities, currencies, whatever it is, we have that as a baseline subscription package people can use. And that’s where we gauge a lot of our error, so that we can tell on ourselves and tell clients where we got things right and where we got things wrong.

 

SFC: You know, if I were a client, I would ask, OK, is that because you were right and everyone else was wrong? Is it because you had more data sources than anyone else, or because of your algorithm, or maybe both?

 

Tony: Yes, that would be my answer. We have over 15 billion items in our core platform. We’re running hundreds of millions of calculations whenever we rerun our forecasts. We can rerun a forecast of the entire global economy, which is every economy, every global trade lane, 200 currency pairs, 120 commodities and so on. We can do that in about 47 minutes.

 

If somebody comes to us and says, we want to run a simulation to understand what’s going to happen in the global economy, we can introduce that and do these hundreds of millions of calculations very, very quickly. And that is important for us. Last September, I don’t know if you remember, there was an attack on a Saudi oil refinery, one of the largest refineries in the world, and crude prices spiked by 18 percent in one day.

 

There were a number of companies who wanted to understand the impact of that crude spike on their cost base. They could come into our platform. They could click, they could tell us that they wanted to rerun their cost basis. And within an hour or two, depending on the size of their catalog, we could rerun their entire cost base for their business.
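A rough sketch of that kind of what-if rerun: apply the commodity shock to the share of each cost item exposed to it, then re-total the cost base. The 18 percent shock is the crude spike from the example; the cost items and exposure shares below are invented for illustration:

```python
def shocked_cost(leaf_costs, exposures, shock):
    """Re-total a cost base after shocking the exposed share of each item.

    leaf_costs: name -> baseline cost; exposures: name -> share of that cost
    driven by the shocked commodity; shock: e.g. 0.18 for an 18% crude spike.
    """
    return sum(cost * (1 + exposures.get(name, 0.0) * shock)
               for name, cost in leaf_costs.items())

costs = {"resin": 40.0, "freight": 25.0, "labor": 60.0}        # hypothetical items
exposure_to_crude = {"resin": 0.7, "freight": 0.5}             # labor: no exposure
print(round(shocked_cost(costs, exposure_to_crude, 0.18), 2))  # 132.29 vs. 125.0 baseline
```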

 

SFC: By the way, how dare you imply that our listeners are not forex experts attuned to every slight movement, especially when there’s no baseball season. What else are we supposed to do? I wanted to ask you: to what extent is the performance of the cloud that you use important to the speed with which you can provide people with answers?

 

Tony: It’s very important, actually. Not every cloud provider allows every kind of software to work on their cloud. When we look at Oracle Cloud, for example, having the ability to run Kubernetes is a big deal, and having the ability to run different types of database software, these sorts of things are a big deal. Not all of these tools have been available on all of these clouds all the time. So the performance of the cloud, but also the tools that are allowed on it, are very, very important for us as we select cloud providers, but also as we deploy on client clouds. We can deploy, let’s say, our CostFlow solution or our RevenueFlow solution on client clouds for security reasons or whatever, so we can just spin up an instance there as needed. It’s very important that those cloud providers allow the tools we need to spin up an instance so that those enterprise clients can have the functionality they need.

 

SFC: So now I’m the one who’s going to insult our readers, or listeners rather. For those of us who are not fully conversant on why it’s important to allow Kubernetes, could you elaborate a little bit on that?

 

Tony: Well, for us, it has a lot to do with the scale of data that’s necessary and the intensity of computation that we need. It’s a specific type of tool that we need just to get our work done. It’s widely accepted, and it’s one of the tools we’ve chosen to use. Oracle, for example, has worked very hard to get that software online and working on its cloud. It is just one of the many tools that we use, but it’s a critical tool for us.

 

SFC: With your specialization being around cost, I have to ask… Is cost relevant to your own business when it comes to the cloud? How so?

 

Tony: Yeah, of course it is. For us, it’s the entry cost, but it’s also the running cost for a cloud solution. And so that’s critically important for us. And not all cloud providers are created equally. So we have to be very, very mindful of that as we deploy on a cloud for our own internal reasons, but also as we deploy on a client’s cloud, because we want to make sure that they’re getting the most cost-effective service and the best performance. Obviously, cost is not the only factor, so we need to help them understand that cost-performance tradeoff if we’re going to deploy on their cloud.

 

SFC: Do you see this happening across all industries or just ones where, you know, the sort of national security concerns or food concerns, things that are clearly important in the case of some kind of emergency?

 

Tony: I see it happening maybe not across all industries, but across a lot of industries. So the electronics supply chain, for example, there’s been a lot of movement toward Mexico. You know, in 2018, the US imported more televisions from Mexico than from China for the first time in 20 some years. So those electronics supply chains and the increasing sophistication of those supply chains are moving. So that’s not necessarily sensitive electronics for, say, the Pentagon. That’s just a TV. Right. So we’re seeing things like office equipment, other things. You know, if you look at the top ten goods that the US receives from China, four of them are things like furniture and chairs and these sorts of things which can actually be made in other cheaper locations like Bangladesh or Vietnam and so on. Six of them are directly competitive with Mexico. So PCs, telecom equipment, all these other things.

 

So, you know, I actually think that much of what the US imports will be regionalized. Not all of it, of course, and not immediately. But I think there’s a real drive to reduce supply chain risk coming from boards and from executive teams. And so I think we’ll really start to see that gain momentum toward the end of 2020 and into early 2021.

 

SFC: That is super interesting. Thank you for joining us. We’re kind of up against time, but I want to thank Tony for being on. I want to do a special shout-out to Oracle for Startups, which works with cool companies like Complete Intelligence. Thanks for joining us. Please find Stories from the Cloud on iTunes or wherever you get your podcasts and tune in again. Thanks, everybody.


How to Make Cloud Pricing More Transparent

This article on “How to Make Cloud Pricing More Transparent” was originally published at https://www.eweek.com/cloud/how-to-make-cloud-pricing-more-transparent

 

eWEEK CLOUD PERSPECTIVE: It used to be nearly impossible to compare cloud costs because different providers typically have their own nomenclature for cloud features, define services differently and offer different tiers of services that don’t line up with one another. Forget apples-to-apples comparisons; cloud price bake-offs were more like contrasting apples to peach cobblers. But help is here.

 

Cloud has inspired almost as much evangelical fervor as open source computing, particularly in the heady 2000s. The advent of cloud computing seemed to render traditional enterprise software vendors as out-of-date as telegraph operators. The monolithic process of releasing software every 18 months wasn’t fast enough for business, running your own servers became as fashionable as generating your own electricity, and the expense involved restricted technology access to the wealthiest businesses.

 

Cloud computing represented a true democratization of enterprise IT, allowing small companies to compete with bigger rivals without breaking the bank to buy servers, storage and software. Tens of millions of dollars for the right to walk onto the playing field were no longer required.

 

The other promise of cloud computing was of a more transparent and equitable business model.

 

In one of my first interviews as an IT reporter, in 2003, I asked the chief technology officer of a large health IT organization to define enterprise software. “It’s when they can’t tell you the price of the software upfront,” he said.

 

Sure, this lack of transparency reflected the complexity of the applications on offer, but it also showed that the dominant sales model gave more power to vendors than customers.

 

The emergence of profitable cloud-native businesses both threatened existing business models and inspired business transformation. The agility and innovation made possible by cloud computing inspired many businesses to move their IT stacks from their own server rooms or data centers to the cloud.

 

 

The law of universal gravitation as applied to the cloud

 

By 2020, however, the low-hanging fruit has been picked. Businesses have reaped the benefits of relatively lower costs and more frequent innovation. And with the lion’s share of IT spending at most companies moving into the cloud, cost – and cost transparency – matters. Yet, the transparency promised by the cloud revolution has largely failed to materialize.

 

As was the case with the previous generation of technology, obfuscation isn’t a bug, it’s a feature, and it begins with Newton’s Law of Universal Gravitation. Pricing structures at legacy cloud providers punish moving data from one cloud to another. By intentionally making the cost of putting data into their clouds as low as possible, while making it prohibitively expensive to move data out to interact with systems in different clouds—a concept known as data gravity—they are walling in their customers. This is an explicit strategy to make their clouds “sticky” and keep applications from moving to other clouds.

 

But the reality is that businesses want and need to operate in different cloud environments for many reasons. Not to mention, who wouldn’t want to cut 10, 30, or even 80 percent of cloud costs if possible?

 

 

 

Newton’s law of motion applied to the cloud

 


 

There is help available. One example is the Oracle Cloud Workload Cost Estimator, a new tool for obtaining empirical cost information. It lets customers assess the comparative costs of Oracle Cloud Infrastructure and Amazon Web Services in as close to a real apples-to-apples comparison as possible.

 

The calculator prices not only computing and storage costs, but also IOPS (input/output operations per second) and data transmission out of the cloud. That last factor, also known as data egress, is usually a wild card because traditional cloud companies start charging a markup after a given amount of data flows out. So once you hit a monthly threshold—1GB for AWS, according to the cost estimator—data egress charges kick in. At Oracle, the meter doesn’t start until after 10,000 times more data egress—or 10TB—per month.
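For a sense of how those thresholds play out, here is a rough sketch of the free-tier arithmetic. The 1GB and 10TB allowances come from the paragraph above; the per-GB rate and the workload size are made-up placeholders, not actual AWS or Oracle price points:

```python
def egress_charge(gb_out, free_gb, rate_per_gb):
    """Charge only for egress beyond the monthly free allowance."""
    return max(0.0, gb_out - free_gb) * rate_per_gb

monthly_egress_gb = 5_000   # hypothetical workload: 5TB out per month
aws_like    = egress_charge(monthly_egress_gb, free_gb=1,      rate_per_gb=0.09)
oracle_like = egress_charge(monthly_egress_gb, free_gb=10_000, rate_per_gb=0.09)
print(aws_like, oracle_like)   # 449.91 0.0 -> the free allowance dominates the bill
```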

 

IT leaders can enter the parameters of proposed workloads and then run their own OCI vs. AWS comparisons. In the end, they may discover that one cloud provider offers services that are closer to Newton’s third law (that for every action in nature, there is an equal and opposite reaction) than to his first.

 

 

 

A few examples

 

Cost and performance go hand in hand, especially as software-as-a-service providers rely on third parties to serve their software to customers. Data technology firm Complete Intelligence, for instance, provides real-time risk management and forecasting services for its customers. It needs to know how much it will spend providing that service on an ongoing basis, and also be sure that its customers get the responsive service their businesses need.

 

“For us, it’s the entry cost, but it’s also the running cost for a cloud solution. And so that’s critically important for us. And not all cloud providers are created equally,” said Tony Nash, CEO of the Houston-based company, which picked Oracle Cloud Infrastructure.

 

Another example of how modern businesses use the cloud is data integration provider Naveego. The company helps customers parse data from a myriad of sources. It cleans the data, deletes duplicates, provides a trail of sources, and then provides a clean golden record of data that is ready for analytics in real time.

 

“To do that, we run instances of our product in multiple availability zones. AWS charges for communications back and forth between those availability zones. Oracle doesn’t, and the cost difference ended up being huge for us. So, we decided to move our research and development, and some production, cloud tenancies to Oracle Cloud,” wrote Naveego CEO Katie Horvath in a blog post.

 

The company has saved 60 percent on its costs since moving to Oracle Cloud, while being able to do more research and development. “Oracle’s claims that Oracle Cloud Infrastructure is 65 percent more cost effective on compute have also proven to be true for Naveego,” she says.

 

We’re starting a new decade on an awkward footing, and businesses need technology to help make smarter decisions. They may still want to fail fast, but they will also want to know what went wrong fast, what the fast road looks like to the promised land – and at long last, what it costs to get there. They’ve long known the cost of sending a telegram, and they can finally figure out the cost of using the cloud.

 

Michael Hickins is a former eWEEK and Wall Street Journal editor and reporter.


Blockchain, big data and the value for global trade

This article was originally published at https://www.ibtimes.com/blockchain-big-data-value-global-trade-2582472

 

Most data turns out to have a greater value than the sum of its parts. There’s a story about a global courier firm that saw a large drop-off in its monthly orders at some point in 2007, not long before the bottom fell out of the global economy. Traditional economic forecasting did not see an issue, but had there been some visibility into trade finance data at the time, it would have shown that many contracts had been cancelled, affording some warning of what was ahead.

 

Macro-economic forecasters and statistical analysts know that trade data provides the most precise window into the global economy there is. Trade finance data has always been a notoriously opaque part of the supply chain, but we are now seeing end-to-end digitisation, as a multitude of banks and software providers test out trade/supply blockchains and other digital platforms.

 

Supply chains have traditionally transacted using antiquated processes and physical letters of credit, bills of lading and purchase orders. This is now being rapidly digitalised, making the data easier to aggregate and access.

 

Christian Lanng, CEO of Tradeshift (a trade finance and supply chain solution working with HSBC and Santander), said: “Over the next two to five years we are going to see massive digitalisation of that global trade dataset. It’s moving from offline to online. It also means you can trade on it in a whole different way, leverage it in different ways. You can use it for research and in other contexts.”

 

Tradeshift supports over 100 different transaction types, like bills of lading, letters of credit, invoices, purchase orders and more. Lanng said one of the main reasons you are seeing so much data aggregation is cloud computing, which has driven down the cost of the process. “All this is built on open, powerful platforms. We have open APIs. We can build third-party applications that can leverage this data.”

 

Today there’s a whole industry that revolves around alternative data that can be cleaned and readied for use by alpha-hungry asset managers and hedge funds. But while a company like Visa has one of the most valuable datasets on the planet, it would never reveal transaction data about its customers. The same could be said about Maersk, which has at its fingertips a snapshot of more than half the world’s supply chains.

 

So while the platforming of supply chains and the use of shared ledgers will undoubtedly make this data more immediate and more granular – the question is will it ultimately become more accessible?

 

Tony Nash, CEO at data analytics firm Complete Intelligence, says trade datasets are a very good proxy of global supply/demand, but trade finance remains a missing piece.

 

“Trade lets us know exactly what’s happening on supply chains; you know exactly what’s happening with finished goods. The problem with trade finance today is that there is a huge amount of opacity. Individual banks and trade finance firms aren’t really that willing to surrender their trade finance data, so anything around trade finance that’s public is really estimated.

 

“I think blockchain will bring some transparency to trade finance and actual aggregate values, and will do it in a much more real time sense. A lot of this information is so far delayed that you don’t really know what’s going on.”

 

“There isn’t a global source for this stuff. We’ve got goods data; we’ve got some services data, and all this currency commodity macro data. Trade finance adds a very interesting layer in terms of the timing of impacts and how they impact the cyclical nature of trade.”

 

Those who follow the enterprise blockchain space know that data privacy is a sine qua non – as much between participant banks running nodes as anyone else. Professional data collection and curation players are well aware of this fact. But they see the potential and are watching the space closely.

 

Tammer Kamel, CEO of Quandl, a well-known provider of data insights to the asset management industry, said: “Almost all our hedge fund customers covet supply chain insights. While Quandl has some data now that illuminates part of the picture, gaps remain. Blockchain adoption could well be what throws more light on things.

 

“That said, most supply chain participants have strong incentives to keep the details of their business operations confidential. In this space, as in others, we are watching blockchain adoptions closely to see what powerful data will emerge as the by-product.”

 

Blockchain technology is very much in the offing and there are likely to be many variations in design as different use cases are fully explored. Amber Baldet, blockchain programme lead at JP Morgan, has overseen the creation of an extremely security-conscious modification of the Ethereum public blockchain called Quorum, which is aimed at enterprise uses. However, she said some blockchain uses – and she used global trade as an example – will likely see lots of value in the future by being spread across very large networks of users.

 

“Everybody is looking at supply chain on blockchain. If you only want to internalise receivables flows or letters of credit – markets between a relatively small number of banks – then you are fine using an enterprise blockchain solution purpose built for a small group of semi-trusted parties,” said Baldet.

 

“But if you think that over time you want to add thousands or millions of end point suppliers to this thing; not only corporates, but perhaps actual vendors making fabrics and very little shops … any sufficiently adopted sort of permissioned chain starts to look a lot more like a public chain.”

 

This type of blockchain, while not open to the entire world, might become so adopted that we will gradually see security/utility trade-offs. At some point this seems to bleed into the old internet/intranet argument often invoked by opponents of private blockchains.

 

Jeremy Epstein, CEO of Never Stop Marketing has spent some time thinking about the intersection of big data and blockchains and how this might play out. He sees this phase of blockchain construction as similar to the big excitement in the late 90s around corporate intranets. “Over time, there will be a realisation that the real value is in permissionless, or the internet version of the blockchain. I think what you’ll start to see is new supply chains that are being built around these sort of public blockchains,” said Epstein.

 

He pointed out that companies like Google and Facebook lead the way in artificial intelligence and machine learning at the moment partly because they have access to the most data. But that might change with a more open, decentralised internet.

 

“In the short term, big companies with massive amounts of data are going to have an advantage. But I think eventually we will see the value of the public blockchain; the ease of innovation on top of it, that’s the real differentiator.

 

“Collaboration is going to happen as more and more of these small players start jumping on, creating nodes and micro-niche apps. It could be a Kenyan coffee supply chain node on this global coffee supply chain.

 

“Big companies can’t innovate at the micro level because it’s not valuable or profitable for them. But some entrepreneur in Nairobi can. And all of a sudden the data on public blockchains starts exploding. So if I’m going to analyse what’s happening in the global coffee market, I can look at this public blockchain; I can put my AI and machine learning algorithms on top of it,” he said.

 

“The more data wins, and eventually there’s going to be more data in the public blockchains. No private organisation or even private ecosystem is going to be able to compete with billions of people.”