Leveraging data strategically can yield transformative outcomes. Integrating disparate data sources turns data into a strategic asset, unlocking pathways to sustainable growth and innovation. At Torry Harris, we offer comprehensive data solutions to help enterprises manage their data modernization initiatives.

  • Over 25 years of extensive delivery experience in data management and data analytics.
  • A qualified team of more than 500 data engineers, designers, and architects.
  • End-to-end solutions covering sources, data sets, users, workloads, and governance.
  • Simple, scalable, and cost-optimized data solutions.

Data
consulting

By meticulously analyzing your digital infrastructure, we identify opportunities for maximum utilization and repurposing. We emphasize simplicity, scalability, and adaptability, ensuring that our solutions not only meet your current needs but also evolve alongside your organization's growth trajectory.

Data
transformation

We bring our expertise to give your digital estate a complete overhaul: we set up a modern data platform and activate capabilities that make your landscape future-ready for seamless big data integration and on-demand, actionable advanced analytics.

Data architecture, design &
implementation

We architect your data transformation to deliver an efficient, future-proof platform supplemented with actionable insights, designing solutions with minimal technical debt.

Data automation tools and digital accelerators

We introduce automation tools and solution accelerators that leverage AI, ML, and Generative AI concepts and frameworks, not only reducing CAPEX and OPEX but also facilitating your transition to AIOps and a self-serve environment.

What data services do you need?
What is our expertise?
What tools do we use?
  • Data transformation
  • Data virtualization
  • Data governance
  • Data analytics
  • Data marketplace
  • Data architecture
  • Data integration
  • Data migration

  • Data lifecycle engineering on cloud, on-prem, fog, and edge
  • Batch processing
  • Data streaming
  • Storage migration
  • Database migration
  • Cloud migration

  • ETL/Intelligent ETL
  • RDBMS
  • NoSQL
  • Lakehouse, data lake, Delta Lake
  • Data warehouse
  • Big data processing
  • Data orchestration
  • Data archival

Data virtualization – Tableau dashboard acceleration

Data as a service (DaaS)

Databridge – Energy supply client

Master data management – Telco client

Device data for network performance analysis

Problem:

The issue involved a lengthy data preparation phase required for Tableau visualization and a poor user experience on Tableau dashboards caused by query performance issues.

Business benefits:

  • 50-80% time savings in BI project delivery.
  • Tableau dashboards load within five seconds, irrespective of complexity.

Solution:

  • Data virtualization to accelerate data preparation for BI visualization requirements
  • Self-service (DIY) development for BI teams
  • Acceleration of Tableau queries through Smart Query (Summary) Acceleration technique
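The summary-acceleration idea can be illustrated with a small sketch: detail rows are pre-aggregated once into a compact summary, and dashboard queries read that summary instead of scanning the raw data. The function names below (`build_summary`, `dashboard_query`) and the sample schema are illustrative assumptions, not the actual product API.

```python
from collections import defaultdict

def build_summary(detail_rows):
    """Pre-aggregate detail rows by (region, product) once, ahead of query time."""
    summary = defaultdict(lambda: {"sales": 0, "orders": 0})
    for row in detail_rows:
        key = (row["region"], row["product"])
        summary[key]["sales"] += row["sales"]
        summary[key]["orders"] += 1
    return dict(summary)

def dashboard_query(summary, region):
    """A dashboard query now scans the small summary, not the raw detail rows."""
    return sum(v["sales"] for (r, _), v in summary.items() if r == region)

detail = [
    {"region": "EU", "product": "A", "sales": 100},
    {"region": "EU", "product": "B", "sales": 50},
    {"region": "US", "product": "A", "sales": 70},
]
summary = build_summary(detail)
print(dashboard_query(summary, "EU"))  # 150
```

Because the summary is orders of magnitude smaller than the detail data, the same dashboard query returns in a fraction of the time, which is the effect behind the load-time figures above.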

Problem:

The issue involved a lengthy Java development process required to expose data sets as APIs, along with the complexity of the distributed access-control setup.

Business benefits:

  • 50-80% time savings in exposing data sets as APIs.
  • Centralized, granular access control.

Solution:

  • Data virtualization to expose data sets as APIs out of the box
  • Centralized role-based access control to enforce data security at the view, column, and row level
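Centralized role-based access control at the view, column, and row level can be sketched as follows; the policy table and role names here are hypothetical stand-ins for whatever the virtualization layer actually stores.

```python
# Hypothetical role policies: the columns a role may see and a row-level filter.
POLICIES = {
    "analyst": {"columns": {"id", "region", "revenue"},
                "row_filter": lambda r: True},
    "regional_user": {"columns": {"id", "region"},
                      "row_filter": lambda r: r["region"] == "EU"},
}

def secure_view(rows, role):
    """Apply row-level security first, then project only the permitted columns."""
    policy = POLICIES[role]
    visible = [r for r in rows if policy["row_filter"](r)]
    return [{k: v for k, v in r.items() if k in policy["columns"]}
            for r in visible]

data = [
    {"id": 1, "region": "EU", "revenue": 10, "cost": 7},
    {"id": 2, "region": "US", "revenue": 20, "cost": 12},
]
print(secure_view(data, "regional_user"))  # [{'id': 1, 'region': 'EU'}]
```

Enforcing the policy in one shared layer, rather than in each consuming application, is what makes the access control "centralized" in the solution above.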

Problem:

There was a need to connect corporate data residing in an AWS data platform to an Azure data platform in near real time to improve recommendations.

Business benefits:

  • Improved quality and performance of plants.
  • Increased customer satisfaction – increased quality and traceability.
  • Reduced production costs by anticipating equipment issues and reducing cycle time.
  • Support for continuous manufacturing processes performance improvement.
  • Insights delivered to plant and supply chain managers, maintenance and quality engineers, and performance experts, enabling data-driven actions.

Solution:

Dev Portal – A single sign-on enabled web application for data owners to manage publication and subscription of their data objects from the corporate big data platform. The API-ready Dev Portal application is built in a microservices architecture style.

Event Mediation Hub – A Kafka platform-as-a-service providing the publish-subscribe backbone for the enterprise, used as the data transportation mechanism. The component also hosts a sink connector, based on the Kafka Connect framework, to push data to Azure Event Hub.

Data Publisher – An intelligent ETL solution on the AWS stack that publishes data objects configured in the Dev Portal to the corresponding Kafka topics. The solution is data-object agnostic and scales to support future data publishing requirements of Intel DS.
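The flow above (publisher, Kafka topic, sink connector, Azure Event Hub) can be sketched in-memory. The `Broker` and `EventHubSink` classes below are deliberately simplified stand-ins for Kafka and Kafka Connect, not their real APIs, used only to show the publish-subscribe hand-off.

```python
from collections import defaultdict

class Broker:
    """Minimal stand-in for the Kafka publish-subscribe backbone."""
    def __init__(self):
        self.topics = defaultdict(list)
        self.subscribers = defaultdict(list)

    def publish(self, topic, record):
        # Append to the topic log, then notify every subscriber of that topic.
        self.topics[topic].append(record)
        for handler in self.subscribers[topic]:
            handler(record)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

class EventHubSink:
    """Stand-in for a Kafka Connect sink pushing records to Azure Event Hub."""
    def __init__(self, broker, topic):
        self.forwarded = []  # records "delivered" to the Event Hub
        broker.subscribe(topic, self.forwarded.append)

broker = Broker()
sink = EventHubSink(broker, "corp.sales")    # sink connector attached to a topic
broker.publish("corp.sales", {"order": 42})  # the Data Publisher emits a record
print(sink.forwarded)  # [{'order': 42}]
```

The decoupling shown here is the point of the design: the Data Publisher only writes to a topic, and the sink connector independently forwards whatever arrives, so either side can change without the other knowing.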

Problem:

Data fragmentation

  • Multiple sources of business data siloed across multiple CFUs
  • No consistency for business, contact and consent data across systems
  • No single version of the truth

System sync latency

  • Daily refresh of data took up to 28 hours
  • Syncing between multiple CRM systems took up to 2-3 weeks
  • Syncs suffer from fallout, delaying processes further
  • Workarounds due to lag causing further sync/DQ issues

Sales/service

  • Duplication of businesses – missed sales opportunities, duplicate ownership, poor CX
  • Retention of ceased businesses – wasted sales & marketing effort
  • No single version of truth – unable to target effectively
  • Latency of updates – missed sales opportunities, incorrect business details

Maintain/fix

  • Minimal ability for business users to review/update DQ issues
  • Bespoke in-house process – limited status visibility, specialist resources needed for maintenance, difficult to integrate with new applications
  • Long delivery times – multiple process & system updates for one change

Solution:

  • Single centralized master for Business, Contact, Consent & Sales Ownership
  • Creation & management of a golden record from external & internal sources
  • Systems synced to the MDM, removing any dependencies
  • API first approach to integrating systems and applications
  • Master once, reuse many – the MDM serves as the source for any new application requiring business data
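Creating a golden record from several internal and external sources is essentially a survivorship problem. The sketch below assumes one simple, illustrative rule (prefer the most recently updated non-empty value per field); a real MDM platform would apply configurable, per-field survivorship rules.

```python
def golden_record(records):
    """Merge duplicate business records into one golden record.

    Illustrative survivorship rule: for each field, keep the most recently
    updated non-empty value across all source records.
    """
    ordered = sorted(records, key=lambda r: r["updated"])
    merged = {}
    for rec in ordered:                 # later records overwrite earlier ones
        for field, value in rec.items():
            if value not in (None, ""):  # empty values never survive
                merged[field] = value
    return merged

sources = [
    {"name": "Acme Ltd", "phone": "", "segment": "SME", "updated": 1},
    {"name": "Acme Limited", "phone": "0123", "segment": None, "updated": 2},
]
print(golden_record(sources))
# {'name': 'Acme Limited', 'phone': '0123', 'segment': 'SME', 'updated': 2}
```

Note how the merged record takes the newer name and phone number but retains the older record's segment, which the newer source left empty; this is the "single version of the truth" the problem statement calls for.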

Problem:

Data aggregation and processing were required to unify information from multiple mobile sites and cells, facilitating analytics on capacity, performance management, energy efficiency, and power throughput.

Business benefits:

  • Data generated from 20K sites and 260K cells – covering signal frequency, uplink, downlink, and related metrics – is first collated and standardized in Greenplum DB; around 30 GB of data is processed daily.

Solution:

  • Raw data is stored in CCN; dimensions and facts are built on BigQuery in DPN.
  • The Formula Master contains all the KPIs and rules used to standardize fields such as frequency and decibel levels.
  • Analytics using Looker and Vertex AI.
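The Formula Master idea, a central table of per-field rules applied during standardization, can be sketched as follows. The field names, units, and thresholds below are invented for illustration; the actual rules live in the client's Formula Master.

```python
# Illustrative "Formula Master": one standardization rule per known field.
FORMULA_MASTER = {
    # Hypothetical rule: values over 10,000 are assumed to be kHz, convert to MHz.
    "frequency_mhz": lambda v: round(v / 1000, 1) if v > 10000 else v,
    # Hypothetical rule: clamp signal readings into a plausible dBm range.
    "signal_db": lambda v: max(min(v, 0), -120),
}

def standardize(row):
    """Apply the rule for each field the Formula Master knows; pass others through."""
    return {k: FORMULA_MASTER.get(k, lambda v: v)(v) for k, v in row.items()}

raw = {"cell_id": "C1", "frequency_mhz": 2100000, "signal_db": -150}
print(standardize(raw))
# {'cell_id': 'C1', 'frequency_mhz': 2100.0, 'signal_db': -120}
```

Keeping the rules in one lookup table, rather than scattered through pipeline code, is what lets a rule change (e.g. a new decibel range) take effect across all 260K cells without touching the processing logic.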