This paper addresses the possibility of capacity withholding by energy producers seeking to raise the market price and increase their own profits. The energy market is simulated as an iterative game in which each stage game corresponds to an hourly energy auction with a uniform pricing mechanism. The producers are modeled as agents that interact with their environment through a reinforcement learning (RL) algorithm. Each producer submits a step-wise offer curve, consisting of quantity-price pairs, to the independent system operator (ISO) under incomplete information. An experimental modification is introduced into the producer's profit maximization model so that the iterative algorithm converges to a withholding bidding value. Each producer can withhold the output of its own generating unit within a continuous range of its available capacity. The RL update relation is also developed so that it does not become invalid in certain situations. Results on a small test system demonstrate the emergence of capacity withholding by the producers and its effect on the market price.
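To illustrate the kind of interaction described above, the following is a minimal sketch (not the paper's actual model) of a uniform-price hourly auction cleared from step-wise offers, with one strategic producer learning how much capacity to withhold via a simple epsilon-greedy value update. All quantities, costs, the fixed rival offers, and the discretized action set are illustrative assumptions; the paper considers withholding over a continuous range and a richer RL formulation under incomplete information.

```python
import random

def clear_uniform_price(offers, demand):
    """Clear a uniform-price auction.

    offers: list of (producer_id, quantity_MW, price) offer steps.
    Returns (clearing_price, dispatch dict producer_id -> accepted MW).
    """
    dispatch = {}
    remaining = demand
    price = 0.0
    for pid, qty, p in sorted(offers, key=lambda s: s[2]):
        if remaining <= 0:
            break
        accepted = min(qty, remaining)
        dispatch[pid] = dispatch.get(pid, 0.0) + accepted
        remaining -= accepted
        price = p  # uniform price is set by the marginal (last accepted) step
    return price, dispatch

def profit(pid, price, dispatch, marginal_cost):
    return (price - marginal_cost) * dispatch.get(pid, 0.0)

# Strategic producer: assumed 100 MW capacity, 20 $/MWh marginal cost.
# Action = withheld fraction of available capacity (discretized here only to
# keep the value table simple; the paper allows a continuous range).
actions = [0.0, 0.1, 0.2, 0.3, 0.4]
q_values = {a: 0.0 for a in actions}
alpha, epsilon = 0.1, 0.2
capacity, cost, demand = 100.0, 20.0, 150.0

# Rival offers are held fixed for simplicity; incomplete information means the
# learning agent observes only its own reward, not the rivals' offer curves.
rival_offers = [("r1", 60.0, 25.0), ("r2", 60.0, 35.0), ("r3", 60.0, 50.0)]

for episode in range(2000):
    a = (random.choice(actions) if random.random() < epsilon
         else max(q_values, key=q_values.get))
    offered_qty = capacity * (1.0 - a)          # withhold a share of capacity
    offers = rival_offers + [("strategic", offered_qty, cost)]
    price, dispatch = clear_uniform_price(offers, demand)
    reward = profit("strategic", price, dispatch, cost)
    q_values[a] += alpha * (reward - q_values[a])  # stateless value update

print("Estimated profit per withholding level:", q_values)
```

With these assumed numbers, withholding roughly 20% of capacity pushes the marginal price from the 25 $/MWh rival step to the 35 $/MWh step, so the learned values favor a nonzero withholding level, mirroring the qualitative effect studied in the paper.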