We are passionate about data
Design and implementation of data warehouses is our core competency. Our data warehouse framework streamlines the development process and optimizes maintenance costs. With many years of experience with data warehouse projects in large international organizations, we can ensure the quality and timeliness of our delivery on every project.
Big data helps a business understand a fast-changing market and anticipate customer needs ahead of time. Collecting, managing and applying big data technologies requires specific expertise. Our consultants have concrete experience with big data solutions using Hadoop, Spark, Python and Scala. We deliver stable solutions that fulfill the specific needs of our clients.
Data science helps a business better understand its internal and external opportunities. It is a powerful tool in the hands of the right data scientist. We have deep knowledge of and experience with data science tools. Besides delivering concrete solutions, we can also help our clients build up a data science team over time.
HarryConsulting implemented a data warehouse (DW) platform for the Capital Market and Investment Management area at a large financial institution in Denmark. The platform was later extended with new scope and became an enterprise data warehouse (EDW) system. The implementation took 12 months, from requirement analysis and business data modeling to the final system deployment and hyper-care period. The Microsoft BI stack — SQL Server, SSIS, SSAS, SSRS, MDS and C# — was used as the fundamental toolset in the implementation. A data warehouse automation concept and framework was also applied to optimize and streamline the development and maintenance work.
This blog post provides more detail about the automation framework.
We have been participating in the implementation of the next-generation contingency and capacity management application for the largest shipping corporation in the world. In this large project, HarryConsulting delivers the data modeling, data analysis and development work to build up the whole data integration flow and the central database for all tracks of the new application. Several generations of data architecture have been implemented and utilized in the solution, moving from traditional batch-driven processing, through event-driven processing, to a hybrid data integration architecture. Microsoft Azure is the sole cloud partner of the project, and all implementations use Azure technologies such as Azure SQL, Azure Data Factory, Azure Data Lake, Azure Databricks, Azure Event Hubs, Azure Blob Storage and so on.
HarryConsulting has worked together with Aalborg University to create a self-adaptive ETL technology that addresses the challenges of unstable source data delivery found in many ETL solutions. The technology has inspired discussions and resulted in joint work with a startup BI consulting company in Copenhagen.
This blog post describes the technology in detail.