
Love it. Using it for load balancing of substation transformers. The grid is a graph.



Interesting you should say that, as I am trying to start a project where I need to make an electric grid graph, but I am not sure where to find the node/edge data for substations and transmission lines that include their specs and capacities. Is that stuff open source somewhere, like with the ISOs, or do you need to build it from scratch?


That data is privately held by utilities, and the high voltage transmission infrastructure is highly confidential (CEII/NERC CIP). If you just need sample data, I'd recommend checking out the test data provided with power flow simulators like OpenDSS for distribution systems [1] or MATPOWER for generation + transmission [2]. The IEEE test systems are the standard in research; they include the component specs you're looking for and ship with those tools.

[1] https://sourceforge.net/projects/electricdss/ [2] https://matpower.org


MATPOWER and OpenDSS test cases are also available in CSV format:

https://github.com/casecsv

https://github.com/cktcsv


Thanks to you and the next commenter up, these synthetic data sets are perfect for my current use case.


Have a look at IEC CIM (Common Information Model)

  - https://en.m.wikipedia.org/wiki/Common_Information_Model_(electricity)
  - https://zepben.bitbucket.io/cim/cim100/
  - https://ontology.tno.nl/IEC_CIM/


Thanks!


I have tried to talk to engineers about contingency analyses for what would happen if a unit went down, and they tend to have very wishy washy answers. Or giant tediously compiled reports that can model exactly one change.

Any idea why that is?


Good question. Are you talking about generation, transmission or distribution?

From my experience as an electrical engineer working for a distribution network:

* The traditional approach to network planning: take your edge cases (e.g. winter peak demand), and apply your engineering knowledge and intuition to manually study the most onerous outage conditions.

* This varies depending on where you are in the world, but networks tend to have a good amount of slack built in.

* As networks have become more complex and the cost of computing has fallen, it's become more feasible to automate contingency analysis (think about the number of different outage combinations in an N-2 scenario).

FWIW, the internal tools that I work on make use of networkx to determine contingency cases.
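The commenter's actual tooling isn't shown, but here's a minimal sketch of how networkx could enumerate outage combinations and flag contingencies that island part of a network. The bus names and topology are invented for illustration. (The combinatorics bite quickly: a network with 500 branches has C(500, 2) = 124,750 N-2 cases.)

```python
from itertools import combinations

import networkx as nx

# Toy network: three substations in a meshed triangle, plus a radial spur.
g = nx.Graph()
g.add_edges_from([
    ("sub_a", "sub_b"), ("sub_b", "sub_c"),
    ("sub_a", "sub_c"), ("sub_c", "sub_d"),
])

def islanding_contingencies(graph, k):
    """Return the k-line outage combinations that split the graph."""
    bad = []
    for outage in combinations(graph.edges, k):
        h = graph.copy()
        h.remove_edges_from(outage)
        if not nx.is_connected(h):
            bad.append(outage)
    return bad

# N-1: losing any one line of the meshed triangle keeps the network
# connected, but sub_d hangs off a single radial line.
print(islanding_contingencies(g, 1))  # only the ("sub_c", "sub_d") outage islands sub_d
```

A real tool would go beyond connectivity and re-run a load flow for each surviving topology, but the enumeration skeleton is the same.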


I meant transmission but I’m interested in both.

Can you say a bit more about built-in slack? You mean like Distribution Automation switches to backfeed an area? That has felt fairly rare to me.


The security of supply standards are conservative. Good for reliability, but this comes at a price. This report goes into more detail: https://www.dcode.org.uk/assets/uploads/IC_Report_exec_summa...

~20 years ago, the regulator introduced an incentive scheme to reduce customer interruptions and minutes lost. This resulted in heavy investment in network automation in the UK.


A single unit going down corresponds to an N-1 case in a transmission system study. There are ways of automating steady state analysis for this case to do a full sweep across the nearby system: looking k hops away, at all parts in a zone (where "zone" has a specific meaning in this context), or at a utility-provided set of assets. This pretty much consists of running a load flow for each individual case and compiling the results while making sure they are valid (convergence, device behavior, etc).
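As an illustration of that loop (solve once per outage, collect the violations), here's a toy N-1 sweep in plain numpy using a DC power flow. The 3-bus system, reactances, and limits are all made up, and the lossless DC approximation stands in for the full AC load flow (so there's no convergence check to speak of):

```python
import numpy as np

# Invented 3-bus system: (from_bus, to_bus, reactance_pu, limit_pu); bus 0 is the slack.
LINES = [(0, 1, 0.1, 0.8), (1, 2, 0.1, 0.8), (0, 2, 0.2, 0.8)]
P = np.array([0.0, 1.0, -1.0])  # net injections (pu): generator at bus 1, load at bus 2

def dc_flows(active):
    """Lossless DC load flow: solve B * theta = P with the slack row/column removed."""
    n = len(P)
    b = np.zeros((n, n))
    for f, t, x, _ in active:
        b[f, f] += 1 / x; b[t, t] += 1 / x
        b[f, t] -= 1 / x; b[t, f] -= 1 / x
    theta = np.zeros(n)
    theta[1:] = np.linalg.solve(b[1:, 1:], P[1:])
    return {(f, t): (theta[f] - theta[t]) / x for f, t, x, _ in active}

# N-1 sweep: drop each line in turn, re-solve, record thermal violations.
violations = []
for k, (f0, t0, _, _) in enumerate(LINES):
    remaining = LINES[:k] + LINES[k + 1:]
    flows = dc_flows(remaining)
    for f, t, _, limit in remaining:
        if abs(flows[(f, t)]) > limit:
            violations.append(((f0, t0), (f, t), flows[(f, t)]))

print(violations)
```

In this made-up case the intact network is within limits, but each single-line outage overloads at least one remaining line, which is exactly the kind of result table a sweep compiles.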

That's only the steady state analysis. There's also dynamic analysis for specific generators, looking at a generator's response to fluctuations in voltage and frequency to ensure stability within certain operating conditions (weakening of the grid, rapid changes in voltage or frequency).
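For a flavor of the dynamic side, here's a textbook single-machine swing-equation sketch, integrated with forward Euler. All constants are invented, and real stability studies use detailed machine and controller models; this just shows a rotor angle swinging back to equilibrium after a disturbance.

```python
import math

# Single machine vs. infinite bus; all parameters illustrative.
M, D = 0.1, 0.1                    # inertia and damping constants
P_m, P_max = 0.8, 2.0              # mechanical input, max electrical transfer (pu)
delta_eq = math.asin(P_m / P_max)  # steady-state rotor angle (rad)

def simulate(delta0, t_end=20.0, dt=1e-3):
    """Forward-Euler integration of M*domega/dt = P_m - P_max*sin(delta) - D*omega."""
    delta, omega = delta0, 0.0
    for _ in range(int(t_end / dt)):
        domega = (P_m - P_max * math.sin(delta) - D * omega) / M
        delta += omega * dt
        omega += domega * dt
    return delta, omega

# Disturb the rotor angle and check that it swings back to equilibrium.
delta_f, omega_f = simulate(delta_eq + 0.3)
print(f"settled at {delta_f:.3f} rad (equilibrium {delta_eq:.3f} rad)")
```

With damping present and the disturbance inside the stability basin, the oscillation decays; push the initial angle past the unstable equilibrium and the machine loses synchronism instead.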

If they were wishy washy, they were probably limited to distribution studies, where you assume a single strong source (swing bus) at the substation and it's (usually) not your responsibility to think too much about adjusting system behavior based on changes in transmission.


I mean transmission. And by wishy washy I mean they have the reports but it’s not compiled into any sort of useful system so they are not able to quickly answer questions about it.

Most seem to outsource this analysis. I'm curious whether you have a sense of how common it is for a transmission utility to really own this kind of analytics.


Not in the field, but don't forget the grid is dynamic and has feedback loops, control algorithms and humans in the loop.

It's closer to an unstable, chaotic system that needs constant balancing and tweaking.


Is that true? Surely things regress to a stable mean most of the time, settling into a few archetypes, no?


Yes. It's generally stable in most localities. "Instabilities" in this system are brownouts, power failures, and other such events. There are stabilising features within most electricity grids, but they can only cope with so much. In general, forward planning is done so that the amount of dynamic adjustment needed stays within an allowable range.

But to be honest, I don't know how modern grids have adapted to having many more micro-generators than in the old days.


V. cool! It's off topic, but what do the algorithms look like in that task?



