Mirror of https://github.com/freqtrade/freqtrade.git (synced 2024-11-10 10:21:59 +00:00)
Fix Case of gym.Env in documentation
This commit is contained in:
parent 6134807c67
commit b93c6235c1
@@ -20,7 +20,7 @@ With the current framework, we aim to expose the training environment via the co
 
 We envision the majority of users focusing their effort on creative design of the `calculate_reward()` function [details here](#creating-a-custom-reward-function), while leaving the rest of the environment untouched. Other users may not touch the environment at all, and they will only play with the configuration settings and the powerful feature engineering that already exists in FreqAI. Meanwhile, we enable advanced users to create their own model classes entirely.
 
-The framework is built on stable_baselines3 (torch) and OpenAI gym for the base environment class. But generally speaking, the model class is well isolated. Thus, the addition of competing libraries can be easily integrated into the existing framework. For the environment, it is inheriting from `gym.env` which means that it is necessary to write an entirely new environment in order to switch to a different library.
+The framework is built on stable_baselines3 (torch) and OpenAI gym for the base environment class. But generally speaking, the model class is well isolated. Thus, the addition of competing libraries can be easily integrated into the existing framework. For the environment, it is inheriting from `gym.Env` which means that it is necessary to write an entirely new environment in order to switch to a different library.
 
 ### Important considerations
 
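For readers of this change: the paragraph touched above describes customizing `calculate_reward()` on an environment that inherits from `gym.Env` through FreqAI's base classes. The following is a minimal sketch of such an override, assuming the `Base5ActionRLEnv`/`Actions` import path and the helpers `self._is_valid()` and `self.get_unrealized_profit()` from the freqtrade codebase; the reward logic itself is purely illustrative, not the project's reference example.

```python
# Minimal sketch of a custom calculate_reward() override.
# Assumptions: the import path below and the helpers `_is_valid()` and
# `get_unrealized_profit()` come from the freqtrade codebase and may differ
# between versions; the reward values are illustrative only.
from freqtrade.freqai.RL.Base5ActionRLEnv import Actions, Base5ActionRLEnv


class MyRLEnv(Base5ActionRLEnv):
    """
    Custom environment inheriting from BaseEnvironment and gym.Env via
    Base5ActionRLEnv; only calculate_reward() is overridden here.
    """

    def calculate_reward(self, action: int) -> float:
        # Penalize actions the environment flags as invalid.
        if not self._is_valid(action):
            return -2.0
        # Reward closing a position by its unrealized profit.
        if action in (Actions.Long_exit.value, Actions.Short_exit.value):
            return float(self.get_unrealized_profit())
        # Otherwise stay neutral.
        return 0.0
```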
@@ -173,7 +173,7 @@ class MyCoolRLModel(ReinforcementLearner):
     """
     class MyRLEnv(Base5ActionRLEnv):
         """
-        User made custom environment. This class inherits from BaseEnvironment and gym.env.
+        User made custom environment. This class inherits from BaseEnvironment and gym.Env.
         Users can override any functions from those parent classes. Here is an example
         of a user customized `calculate_reward()` function.
 
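The hunk above sits inside a documentation example where the custom environment is declared as a class nested in a user model. A short sketch of that nesting pattern follows; the import paths are assumed from the freqtrade codebase and may vary by version.

```python
# Sketch of the nesting pattern referenced by the hunk above: a user model class
# derived from ReinforcementLearner that carries its own MyRLEnv.
# Assumption: the import paths below match the freqtrade codebase.
from freqtrade.freqai.prediction_models.ReinforcementLearner import ReinforcementLearner
from freqtrade.freqai.RL.Base5ActionRLEnv import Base5ActionRLEnv


class MyCoolRLModel(ReinforcementLearner):
    """User created RL prediction model."""

    class MyRLEnv(Base5ActionRLEnv):
        """
        User made custom environment. This class inherits from BaseEnvironment
        and gym.Env; any method of those parent classes can be overridden here.
        """

        def calculate_reward(self, action: int) -> float:
            # Illustrative placeholder: discourage invalid actions, otherwise no reward.
            if not self._is_valid(action):
                return -2.0
            return 0.0
```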
@@ -254,7 +254,7 @@ FreqAI also provides a built in episodic summary logger called `self.tensorboard
 ```python
 class MyRLEnv(Base5ActionRLEnv):
     """
-    User made custom environment. This class inherits from BaseEnvironment and gym.env.
+    User made custom environment. This class inherits from BaseEnvironment and gym.Env.
     Users can override any functions from those parent classes. Here is an example
     of a user customized `calculate_reward()` function.
     """