A utility that customers never think about is a utility that is providing the right service at the right price and at the right level of performance. And that’s exactly how utilities companies want it.
It might actually surprise you to learn just how much data analysis utilities companies do to make sure customers have no reason to think about their utility services.
The demands increase year after year: more and more people must be served with resources that keep shrinking. In the 21st century, utilities companies use data science to:
- Analyze electrical use patterns to manage production capacity more efficiently
- Monitor water consumption to rapidly isolate leaks or overuse
- Reduce and manage waste for more efficient use of limited landfill space
- Rapidly identify and resolve outages of all kinds
Data Flows Like Water and Water Flows Like Data
The ways water utilities companies use data science vary by region. In the desert Southwest, climatological data and river flow rate are the important factors. In the rainy Pacific Northwest, weather data is equally important, but forecasters look instead at seasonal rainfall data and snowpack accumulation. And in the water-rich Northeast, monitoring of the aging infrastructure and delivery systems draws the most attention.
Nationwide, up to 2 trillion gallons of clean water leak out of water systems yearly before ever reaching customers.
As with power companies, smart meters are the future for water utilities. A more precise picture of flows and problems in their systems will allow these companies to narrow down leaks, contamination, and other problems, and quite possibly fix them before customers are any the wiser.
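One common way smart-meter data is used to narrow down leaks is a simple water balance: compare the metered inflow to a district with the sum of its customers' meter readings, and flag districts where too much water goes unaccounted for. The sketch below illustrates the idea with made-up numbers and a hypothetical `leak_candidates` function; it is not any specific utility's system.

```python
# Minimal sketch: flag possible leaks by comparing a district's metered
# inflow with the sum of its customers' smart-meter readings.
# All figures are invented for illustration.

def leak_candidates(districts, tolerance=0.05):
    """Return districts where unaccounted-for water exceeds `tolerance`
    (expressed as a fraction of inflow)."""
    flagged = []
    for name, data in districts.items():
        inflow = data["inflow_gal"]
        consumed = sum(data["meter_readings_gal"])
        loss_fraction = (inflow - consumed) / inflow
        if loss_fraction > tolerance:
            flagged.append((name, round(loss_fraction, 3)))
    return flagged

districts = {
    "district_a": {"inflow_gal": 10_000, "meter_readings_gal": [4_900, 4_950]},  # ~1.5% loss
    "district_b": {"inflow_gal": 10_000, "meter_readings_gal": [4_000, 4_200]},  # 18% loss
}

print(leak_candidates(districts))  # → [('district_b', 0.18)]
```

In practice a real system would also account for metering error, firefighting draws, and authorized unbilled use before declaring a loss a leak.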
How Waste Management Companies Use Data Science to Manage Their Own Fuel Waste
As we have come to fully embrace recycling, data science has had a hand in transitioning America from a throwaway society to a throwaway society that is selective about which bin we choose.
Environmental concerns and land use regulations have played a role in reducing the massive landfills that once were the first and last resort for waste disposal. But recycling requires more information than dumping—different materials require different process flows to different recyclers.
Waste management companies have come to embrace data science as a way to improve profitability.
Pioneering data science firms like San Francisco-based Compology are changing the game. Compology builds sensors and cloud-based analytical software called "Waste OS" designed to inexpensively monitor the loads on garbage trucks in real time, improving routing and efficiency and dramatically reducing fuel consumption. Incorporating the technology has been a no-brainer for waste management companies looking for new ways to increase their bottom line without raising rates in what is a notoriously slow-growth industry.
Smart Meter Machine-to-Machine Networks Leverage the Power of Big Data to Keep the Lights On
With a grid already prepared for advanced instrumentation and transmission, electrical utilities have had a head start when it comes to deploying automation for the purpose of data collection.
The result of those efforts is the emergence of the Smart Grid. At customer sites, smart meters transmit usage data back to the utility. Along transmission lines and at switching stations, other instrumentation monitors voltage and detects faults. These “M2M,” or “machine-to-machine” networks provide a detailed overall picture of the health of the transmission system.
But the flood of data coming in from M2M networks—potentially up to 1,000 petabytes of data a year—has largely overwhelmed the ability of utility companies to process it. Smart grids have been described as “a solution in search of a problem.”
Few utilities companies are equipped to engage in the deep analysis of customer information needed to truly revolutionize the power grid. That’s where data scientists come in.
Identifying “Energy Personalities”
Opower, a California company that manages 40 percent of America’s residential energy data, is starting to make inroads with the information that power utility companies collect on their customers. Using machine learning algorithms, Opower analyzed hundreds of thousands of load curves—the basic energy use profile for a customer over a given 24-hour period—and identified five basic “energy personalities” that describe most Americans.
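Opower's actual pipeline is proprietary, but the underlying idea of grouping load curves into "energy personalities" can be illustrated with a toy clustering pass: households whose 24-hour usage shapes are similar end up in the same cluster. The sketch below is a minimal k-means on invented data, not Opower's method.

```python
# Hedged sketch of the idea behind "energy personalities": cluster
# 24-hour load curves so households with similar daily usage shapes
# group together. All data here is made up.

def distance(a, b):
    """Squared Euclidean distance between two load curves."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(curves, centroids, iterations=10):
    """Cluster load curves (lists of 24 hourly kWh values) around the
    given starting centroids. Returns final centroids and assignments."""
    assignments = []
    for _ in range(iterations):
        # Assign each curve to its nearest centroid.
        assignments = [
            min(range(len(centroids)), key=lambda k: distance(c, centroids[k]))
            for c in curves
        ]
        # Recompute each centroid as the mean of its assigned curves.
        for k in range(len(centroids)):
            members = [c for c, a in zip(curves, assignments) if a == k]
            if members:
                centroids[k] = [sum(h) / len(members) for h in zip(*members)]
    return centroids, assignments

# Two invented usage shapes: an "evening peak" home and a "daytime" home.
evening = [0.2] * 17 + [1.5] * 5 + [0.3] * 2   # usage spikes 5-10 pm
daytime = [0.2] * 8 + [1.0] * 9 + [0.3] * 7    # usage during work hours
curves = [evening, daytime, evening[:], daytime[:]]

_, labels = kmeans(curves, centroids=[evening[:], daytime[:]])
print(labels)  # → [0, 1, 0, 1]
```

At Opower's scale the same grouping would run over hundreds of thousands of real load curves, with the handful of resulting cluster centers serving as the "personalities."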
What’s next? Well, Opower thinks that the next leap past smart metering to the home is smart metering to individual loads within the home. Utilities can only achieve so much by treating a house as a unitary consumer of energy, when the reality is that the usage patterns are defined by the unique combinations of appliances and people within the home.
Customers are already doing some of this work for power companies, using smart grid information to pare down their own bills. But when power companies talk about efficiency, they’re not necessarily talking about saving you twenty bucks. What they are really talking about is demand prediction: using information on how and when millions of different households and individual customers use electricity.
Looking Through Customer Data to Predict Electrical Demand
Conventional power plants take time to spin up to full production capacity. Consequently, to avoid brownouts, utilities err on the side of caution and produce electricity in excess of the forecast demand. The larger the margin of error in the forecast model, the greater the excess capacity required in the system. But excess capacity is wasteful.
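The relationship between forecast error and excess capacity can be made concrete with a toy calculation: schedule the forecast plus a reserve proportional to the model's typical miss. The `scheduled_capacity` function and its numbers below are purely illustrative, not from any real utility.

```python
# Sketch of why forecast accuracy matters: the reserve scheduled above
# the forecast scales with the model's historical error. Illustrative
# numbers only.

def scheduled_capacity(forecast_mw, error_history, safety_factor=2.0):
    """Schedule the forecast plus a reserve proportional to the
    typical absolute forecast error (in MW)."""
    typical_error = sum(abs(e) for e in error_history) / len(error_history)
    return forecast_mw + safety_factor * typical_error

# A sloppy model that has missed by ~100 MW forces a large reserve...
print(scheduled_capacity(1_000, error_history=[90, -110, 100]))  # → 1200.0
# ...while a sharper model that misses by ~20 MW wastes far less.
print(scheduled_capacity(1_000, error_history=[15, -25, 20]))    # → 1040.0
```

Halving the forecast error directly halves the wasted reserve, which is why better demand models translate so directly into savings.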
According to the National Institute of Standards and Technology, analyzing customer data to hone demand forecasting may save the U.S. up to $2 trillion by 2030.
California’s AutoGrid takes the process a step further, integrating utility systems directly with those of larger customers, allowing non-critical loads to be powered down during periods of peak demand.
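AutoGrid's actual integration is proprietary, but the core peak-shaving logic it describes (power down non-critical loads when demand exceeds capacity) can be sketched simply. The `shed_loads` function and its customers below are hypothetical.

```python
# Hedged sketch of peak-shaving demand response: when total demand
# exceeds system capacity, power down non-critical customer loads,
# largest first, until the system is back under its limit.
# Customers and figures are invented.

def shed_loads(loads, capacity_mw):
    """loads: list of (name, mw, critical). Returns names powered down."""
    total = sum(mw for _, mw, _ in loads)
    shed = []
    # Consider the largest loads first so we shed as few sites as possible.
    for name, mw, critical in sorted(loads, key=lambda l: -l[1]):
        if total <= capacity_mw:
            break
        if not critical:
            shed.append(name)
            total -= mw
    return shed

loads = [
    ("hospital", 40, True),     # critical: never shed
    ("smelter", 50, False),
    ("office_hvac", 15, False),
    ("homes", 30, True),
]
print(shed_loads(loads, capacity_mw=100))  # → ['smelter']
```

A production system would layer contracts, advance notice, and compensation on top of this; the sketch only shows the dispatch decision itself.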
Renewable Energy Relies on Data Science to Manage Point Sources of Power
Demand prediction and the smart grid are also vital to scaling the use of renewable energy sources. To manage distributed generation (such as solar panels installed at private homes) requires a smart grid, so power can be fed back into the system from sites that were once only drawing electricity. Managing thousands of point sources of power, with varying characteristics, is a key job for the data scientists that work for electrical utility companies.
Renewable sources also tend to be intermittent: solar is available only during daylight hours and produces varying amounts of electricity based on sun angle and cloud cover, while wind power comes only when the wind blows. Managing generation and load capacity when relying on such sources becomes even more complex.
The U.S. Commerce department forecasts that such power sources will make up almost 30% of U.S. generating capacity by 2030. To get there will require a lot of work as data scientists model and integrate renewables into the mainstream grid.
Algorithms as Detectives
Working in data science at a power utility can also involve a little detective work. Electrical utilities have always had to contend with customers bypassing meters to, in effect, “steal” power. But even a smart meter can only reveal that less energy is being consumed than before; it can’t say whether that is because the meter was bypassed or because the customer installed more efficient light bulbs.
A 2015 article in Intelligent Utility magazine describes work by a team of data scientists at C3 Energy who have been developing algorithms capable of detecting power theft with better than 90 percent accuracy.
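C3's algorithms are proprietary, but one common idea behind theft detection addresses exactly the ambiguity above: a sudden usage drop at one meter is far more suspicious when comparable neighbors show no such drop, since efficiency upgrades and weather tend to move a whole neighborhood together. The `suspicious_meters` function below is a toy illustration of that peer comparison, not C3's method.

```python
# Toy illustration of peer-based theft detection: flag meters whose
# year-over-year usage drop far exceeds the typical drop among
# comparable meters. Data is invented.

def suspicious_meters(usage, threshold=0.4):
    """usage: {meter_id: (last_year_kwh, this_year_kwh)}.
    Flag meters whose drop exceeds the group's median drop by more
    than `threshold` (as a fraction of prior usage)."""
    drops = {m: (prev - cur) / prev for m, (prev, cur) in usage.items()}
    typical = sorted(drops.values())[len(drops) // 2]  # median drop
    return [m for m, d in drops.items() if d - typical > threshold]

usage = {
    "meter_1": (12_000, 11_500),  # small drop: efficiency, weather, etc.
    "meter_2": (10_000, 9_800),
    "meter_3": (11_000, 4_000),   # ~64% drop while neighbors barely moved
}
print(suspicious_meters(usage))  # → ['meter_3']
```

A flagged meter would still only trigger a field inspection; the algorithm narrows the search rather than proving theft.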