How Azure Data Factory can help process huge files

One of our clients recently faced a problem loading and summarizing growing volumes of data in a legacy database. On top of a sizable store of historical data, large amounts of new data were being generated regularly. The volume had grown to the point where it pushed the limits of the existing system's processing capacity, consuming unreasonable amounts of processing time and forcing end users to wait longer and longer for reports.
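
This is the kind of workload Azure Data Factory is built for: moving and staging bulk data outside the database, so the legacy system is no longer doing the heavy lifting. As a rough illustration, here is a minimal sketch of defining and triggering a single copy pipeline with the azure-mgmt-datafactory Python SDK. All resource names (resource group, factory, pipeline, and dataset names) are hypothetical placeholders, and it assumes the input and output datasets have already been created in the factory; it is not the pipeline built for this client.

```python
# Minimal sketch: define and run an Azure Data Factory copy pipeline.
# Assumes the "InputBlobFiles" and "StagingSqlTable" datasets already
# exist in the factory; all names below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSource, CopyActivity, DatasetReference, PipelineResource, SqlSink,
)

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RESOURCE_GROUP = "rg-data-platform"    # hypothetical
FACTORY_NAME = "adf-bulk-loads"        # hypothetical

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# One Copy activity that streams large blob files into a staging SQL table,
# letting Data Factory handle the chunking and parallelism instead of the
# legacy database.
copy_activity = CopyActivity(
    name="CopyHugeFiles",
    inputs=[DatasetReference(reference_name="InputBlobFiles")],
    outputs=[DatasetReference(reference_name="StagingSqlTable")],
    source=BlobSource(),
    sink=SqlSink(),
)

# Publish the pipeline definition, then kick off a run.
client.pipelines.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "ProcessHugeFiles",
    PipelineResource(activities=[copy_activity]),
)
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, "ProcessHugeFiles")
print(f"Started pipeline run {run.run_id}")
```

Because the copy runs on Data Factory's managed integration runtime, the load can be scheduled, scaled, and monitored independently of the database that was previously struggling to keep up.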
