
AI computer to deliver real-time GDP data

As we await the release of Gross Domestic Product figures this week, Massey University has launched a high-tech real-time GDP tracker that could revolutionise the way we track the nation's economic progress.

The gdpLive project aims to build a real-time model of GDP using up-to-the-minute data such as PayMark's electronic card payment figures.

The GDP data due out will be for the third quarter, to September 30.

That means some of the data it is based on will be nearly six months out of date.

The project is an attempt to use the latest computer technology, and the growing volume of digital data that organisations collect, to better model GDP in real time, says Christoph Schumacher, professor of economics and innovation at the Massey Business School.

It uses advanced self-learning software (artificial intelligence) to continually improve its modelling.

"If we know how much money changed hands yesterday using cards, then that's a fairly good indication of economic activity, because GDP measures market-based transactions, how many products are sold," says Schumacher.

It has a partnership with PortConnect, which supplies data from Ports of Auckland and Tauranga to monitor the movement of goods in and out of the country.

KiwiRail provides freight data and, via the Interislander ferry, real-time tourism data.

The project also pulls in publicly available data from government sources such as Stats NZ, the Reserve Bank, Immigration NZ, the NZ Transport Agency and the Ministry of Business, Innovation and Employment.

It also uses traffic data from the Ministry of Transport, and the economic indicators published by Stats NZ and the Reserve Bank.

It has been developed by the Knowledge Exchange Hub, a multidisciplinary research hub at Massey University.

The project is headed by Schumacher and Dr Teo Susnjak, a computer scientist with Massey University's Institute of Natural and Mathematical Sciences.

Schumacher says the key was to create a program that could process all the up-to-date data while continually benchmarking it against historic data.

"So it keeps learning in little steps, and when we expose it to new data, it uses what it has learned from the past to make a prediction."

The project has been running for three years, and the algorithm has been trained on data going back 10 years.
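The learning loop Schumacher describes, making small updates against historic data and then predicting from the newest figures, can be sketched in miniature. gdpLive's actual model is not public, so the class below, the learning rate, and the sample figures are all illustrative assumptions: a toy online learner that nowcasts quarterly GDP growth from a single indicator such as card-spend growth.

```python
# Illustrative sketch only: gdpLive's real model and data are not public.
# A toy online learner that updates a linear GDP nowcast "in little steps"
# as each (indicator, outcome) pair is replayed from history.

class OnlineNowcaster:
    def __init__(self, lr=0.05):
        self.w = 0.0   # weight on the indicator (e.g. card-spend growth)
        self.b = 0.0   # baseline growth term
        self.lr = lr   # step size for each small update

    def predict(self, card_growth):
        """Nowcast GDP growth from the latest indicator reading."""
        return self.w * card_growth + self.b

    def update(self, card_growth, actual_gdp_growth):
        """One small gradient step toward the observed outcome."""
        error = self.predict(card_growth) - actual_gdp_growth
        self.w -= self.lr * error * card_growth
        self.b -= self.lr * error


# Hypothetical historical pairs: (card-spend growth %, GDP growth %).
history = [(1.0, 0.5), (2.0, 1.0), (3.0, 1.5), (1.6, 0.8)]

model = OnlineNowcaster()
for _ in range(2000):               # repeatedly benchmark against history
    for card, gdp in history:
        model.update(card, gdp)

nowcast = model.predict(2.4)        # today's card data -> instant estimate
```

In this toy history GDP growth is exactly half the card-spend growth, so after training the nowcast for a 2.4 per cent card reading settles near 1.2 per cent; a real system would combine many indicators and a far richer model.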

So far it has proved highly accurate, staying within the expected margin of error, Schumacher says.

For example, it successfully anticipated the strong second-quarter GDP result in June that surprised many economists.

"But this will only get better," he said. "If something completely unexpected were to happen, our algorithm can learn from it."