Merge branch 'develop' into list-pairs2

Commit: 1bc63288a3
@@ -1,4 +1,4 @@
-FROM python:3.7.4-slim-stretch
+FROM python:3.7.5-slim-stretch

 RUN apt-get update \
     && apt-get -y install curl build-essential libssl-dev \
@@ -16,9 +16,9 @@ RUN cd /tmp && /tmp/install_ta-lib.sh && rm -r /tmp/*ta-lib*
 ENV LD_LIBRARY_PATH /usr/local/lib

 # Install dependencies
-COPY requirements.txt requirements-common.txt /freqtrade/
+COPY requirements.txt requirements-common.txt requirements-hyperopt.txt /freqtrade/
 RUN pip install numpy --no-cache-dir \
-    && pip install -r requirements.txt --no-cache-dir
+    && pip install -r requirements-hyperopt.txt --no-cache-dir

 # Install and execute
 COPY . /freqtrade/
@@ -70,5 +70,6 @@
     "forcebuy_enable": false,
     "internals": {
         "process_throttle_secs": 5
-    }
+    },
+    "download_trades": true
 }
@@ -103,12 +103,6 @@ The full timerange specification:
 - Use tickframes since 2018/01/31 till 2018/03/01 : `--timerange=20180131-20180301`
 - Use tickframes between POSIX timestamps 1527595200 1527618600:
   `--timerange=1527595200-1527618600`
-- Use last 123 tickframes of data: `--timerange=-123`
-- Use first 123 tickframes of data: `--timerange=123-`
-- Use tickframes from line 123 through 456: `--timerange=123-456`
-
-!!! warning
-    Be carefull when using non-date functions - these do not allow you to specify precise dates, so if you updated the test-data it will probably use a different dataset.

 ## Understand the backtesting result

@@ -195,6 +189,7 @@ Hence, keep in mind that your performance is an integral mix of all different el
 Since backtesting lacks some detailed information about what happens within a candle, it needs to take a few assumptions:

 - Buys happen at open-price
+- Sell signal sells happen at open-price of the following candle
 - Low happens before high for stoploss, protecting capital first.
 - ROI sells are compared to high - but the ROI value is used (e.g. ROI = 2%, high=5% - so the sell will be at 2%)
 - Stoploss sells happen exactly at stoploss price, even if low was lower
@@ -203,6 +198,9 @@ Since backtesting lacks some detailed information about what happens within a ca
 - Low uses the adjusted stoploss (so sells with large high-low difference are backtested correctly)
 - Sell-reason does not explain if a trade was positive or negative, just what triggered the sell (this can look odd if negative ROI values are used)

+Taking these assumptions, backtesting tries to mirror real trading as closely as possible. However, backtesting will **never** replace running a strategy in dry-run mode.
+Also, keep in mind that past results don't guarantee future success.
+
 ### Further backtest-result analysis

 To further analyze your backtest results, you can [export the trades](#exporting-trades-to-file).
@@ -106,7 +106,7 @@ user_data/
 ├── backtest_results
 ├── data
 ├── hyperopts
-├── hyperopts_results
+├── hyperopt_results
 ├── plot
 └── strategies
 ```
@@ -256,7 +256,7 @@ optional arguments:
                         entry and exit).
   --customhyperopt NAME
                         Specify hyperopt class name (default:
-                        `DefaultHyperOpts`).
+                        `DefaultHyperOpt`).
   --hyperopt-path PATH  Specify additional lookup path for Hyperopts and
                         Hyperopt Loss functions.
   --eps, --enable-position-stacking
@@ -38,7 +38,7 @@ Mixing different stake-currencies is allowed for this file, since it's only used
 ]
 ```

-### start download
+### Start download

 Then run:

@@ -57,6 +57,32 @@ This will download ticker data for all the currency pairs you defined in `pairs.
 - Use `--timeframes` to specify which tickers to download. Default is `--timeframes 1m 5m` which will download 1-minute and 5-minute tickers.
 - To use exchange, timeframe and list of pairs as defined in your configuration file, use the `-c/--config` option. With this, the script uses the whitelist defined in the config as the list of currency pairs to download data for and does not require the pairs.json file. You can combine `-c/--config` with most other options.

+### Trades (tick) data
+
+By default, `download-data` subcommand downloads Candles (OHLCV) data. Some exchanges also provide historic trade-data via their API.
+This data can be useful if you need many different timeframes, since it is only downloaded once, and then resampled locally to the desired timeframes.
+
+Since this data is large by default, the files use gzip by default. They are stored in your data-directory with the naming convention of `<pair>-trades.json.gz` (`ETH_BTC-trades.json.gz`). Incremental mode is also supported, as for historic OHLCV data, so downloading the data once per week with `--days 8` will create an incremental data-repository.
+
+To use this mode, simply add `--dl-trades` to your call. This will swap the download method to download trades, and resamples the data locally.
+
+Example call:
+
+```bash
+freqtrade download-data --exchange binance --pairs XRP/ETH ETH/BTC --days 20 --dl-trades
+```
+
+!!! Note
+    While this method uses async calls, it will be slow, since it requires the result of the previous call to generate the next request to the exchange.
+
+!!! Warning
+    The historic trades are not available during Freqtrade dry-run and live trade modes because all exchanges tested provide this data with a delay of few 100 candles, so it's not suitable for real-time trading.
+
+### Historic Kraken data
+
+The Kraken API does only provide 720 historic candles, which is sufficient for FreqTrade dry-run and live trade modes, but is a problem for backtesting.
+To download data for the Kraken exchange, using `--dl-trades` is mandatory, otherwise the bot will download the same 720 candles over and over, and you'll not have enough backtest data.
+
 ## Next step

 Great, you now have backtest data downloaded, so you can now start [backtesting](backtesting.md) your strategy.
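For illustration only - the short sketch below shows one way to peek at a downloaded trades file; it is not part of this commit, and the path is a hypothetical example following the `<pair>-trades.json.gz` naming convention described above.

```python
import gzip
import json
from pathlib import Path

# Hypothetical location, assuming the naming convention documented above.
trades_file = Path("user_data/data/binance/ETH_BTC-trades.json.gz")

# Trades files are gzip-compressed JSON lists of ccxt-style trade dicts.
with gzip.open(trades_file, "rt", encoding="utf-8") as f:
    trades = json.load(f)

print(f"{len(trades)} trades, from {trades[0]['datetime']} to {trades[-1]['datetime']}")
```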
@@ -249,13 +249,10 @@ freqtrade edge --stoplosses=-0.01,-0.1,-0.001 #min,max,step
 freqtrade edge --timerange=20181110-20181113
 ```

-Doing `--timerange=-200` will get the last 200 timeframes from your inputdata. You can also specify specific dates, or a range span indexed by start and stop.
+Doing `--timerange=-20190901` will get all available data until September 1st (excluding September 1st 2019).

 The full timerange specification:

-* Use last 123 tickframes of data: `--timerange=-123`
-* Use first 123 tickframes of data: `--timerange=123-`
-* Use tickframes from line 123 through 456: `--timerange=123-456`
 * Use tickframes till 2018/01/31: `--timerange=-20180131`
 * Use tickframes since 2018/01/31: `--timerange=20180131-`
 * Use tickframes since 2018/01/31 till 2018/03/01 : `--timerange=20180131-20180301`
@@ -38,7 +38,7 @@ like pauses. You can stop your bot, adjust settings and start it again.

 ### I want to improve the bot with a new strategy

-That's great. We have a nice backtesting and hyperoptimizing setup. See
+That's great. We have a nice backtesting and hyperoptimization setup. See
 the tutorial [here|Testing-new-strategies-with-Hyperopt](bot-usage.md#hyperopt-commands).

 ### Is there a setting to only SELL the coins being held and not perform anymore BUYS?
@@ -59,7 +59,7 @@ If you're a US customer, the bot will fail to create orders for these pairs, and

 ### How many epoch do I need to get a good Hyperopt result?

-Per default Hyperopts without `-e` or `--epochs` parameter will only
+Per default Hyperopt called without the `-e`/`--epochs` command line option will only
 run 100 epochs, means 100 evals of your triggers, guards, ... Too few
 to find a great result (unless if you are very lucky), so you probably
 have to run it for 10.000 or more. But it will take an eternity to
@@ -10,12 +10,12 @@ Hyperopt requires historic data to be available, just as backtesting does.
 To learn how to get data for the pairs and exchange you're interrested in, head over to the [Data Downloading](data-download.md) section of the documentation.

 !!! Bug
-    Hyperopt will crash when used with only 1 CPU Core as found out in [Issue #1133](https://github.com/freqtrade/freqtrade/issues/1133)
+    Hyperopt can crash when used with only 1 CPU Core as found out in [Issue #1133](https://github.com/freqtrade/freqtrade/issues/1133)

 ## Prepare Hyperopting

 Before we start digging into Hyperopt, we recommend you to take a look at
-an example hyperopt file located into [user_data/hyperopts/](https://github.com/freqtrade/freqtrade/blob/develop/user_data/hyperopts/sample_hyperopt.py)
+the sample hyperopt file located in [user_data/hyperopts/](https://github.com/freqtrade/freqtrade/blob/develop/user_data/hyperopts/sample_hyperopt.py).

 Configuring hyperopt is similar to writing your own strategy, and many tasks will be similar and a lot of code can be copied across from the strategy.

@@ -64,9 +64,9 @@ multiple guards. The constructed strategy will be something like
 "*buy exactly when close price touches lower bollinger band, BUT only if
 ADX > 10*".

-If you have updated the buy strategy, ie. changed the contents of
-`populate_buy_trend()` method you have to update the `guards` and
-`triggers` hyperopts must use.
+If you have updated the buy strategy, i.e. changed the contents of
+`populate_buy_trend()` method, you have to update the `guards` and
+`triggers` your hyperopt must use correspondingly.

 #### Sell optimization

@@ -82,7 +82,7 @@ To avoid naming collisions in the search-space, please prefix all sell-spaces wi
 #### Using ticker-interval as part of the Strategy

 The Strategy exposes the ticker-interval as `self.ticker_interval`. The same value is available as class-attribute `HyperoptName.ticker_interval`.
-In the case of the linked sample-value this would be `SampleHyperOpts.ticker_interval`.
+In the case of the linked sample-value this would be `SampleHyperOpt.ticker_interval`.

 ## Solving a Mystery

@@ -1 +1,2 @@
 mkdocs-material==4.4.3
+mdx_truly_sane_lists==1.2
@@ -138,15 +138,19 @@ def populate_buy_trend(self, dataframe: DataFrame, metadata: dict) -> DataFrame:
     """
     dataframe.loc[
         (
-            (dataframe['adx'] > 30) &
-            (dataframe['tema'] <= dataframe['bb_middleband']) &
-            (dataframe['tema'] > dataframe['tema'].shift(1))
+            (qtpylib.crossed_above(dataframe['rsi'], 30)) &  # Signal: RSI crosses above 30
+            (dataframe['tema'] <= dataframe['bb_middleband']) &  # Guard
+            (dataframe['tema'] > dataframe['tema'].shift(1)) &  # Guard
+            (dataframe['volume'] > 0)  # Make sure Volume is not 0
         ),
         'buy'] = 1

     return dataframe
 ```

+!!! Note
+    Buying requires sellers to buy from - therefore volume needs to be > 0 (`dataframe['volume'] > 0`) to make sure that the bot does not buy/sell in no-activity periods.
+
 ### Sell signal rules

 Edit the method `populate_sell_trend()` into your strategy file to update your sell strategy.
@@ -168,9 +172,10 @@ def populate_sell_trend(self, dataframe: DataFrame, metadata: dict) -> DataFrame
     """
     dataframe.loc[
         (
-            (dataframe['adx'] > 70) &
-            (dataframe['tema'] > dataframe['bb_middleband']) &
-            (dataframe['tema'] < dataframe['tema'].shift(1))
+            (qtpylib.crossed_above(dataframe['rsi'], 70)) &  # Signal: RSI crosses above 70
+            (dataframe['tema'] > dataframe['bb_middleband']) &  # Guard
+            (dataframe['tema'] < dataframe['tema'].shift(1)) &  # Guard
+            (dataframe['volume'] > 0)  # Make sure Volume is not 0
         ),
         'sell'] = 1
     return dataframe
@@ -39,7 +39,8 @@ ARGS_LIST_PAIRS = ["exchange", "print_list", "list_pairs_print_json", "print_one

 ARGS_CREATE_USERDIR = ["user_data_dir"]

-ARGS_DOWNLOAD_DATA = ["pairs", "pairs_file", "days", "exchange", "timeframes", "erase"]
+ARGS_DOWNLOAD_DATA = ["pairs", "pairs_file", "days", "download_trades", "exchange",
+                      "timeframes", "erase"]

 ARGS_PLOT_DATAFRAME = ["pairs", "indicators1", "indicators2", "plot_limit", "db_url",
                        "trade_source", "export", "exportfilename", "timerange", "ticker_interval"]
@@ -2,7 +2,6 @@
 Definition of cli arguments used in arguments.py
 """
 import argparse
-import os

 from freqtrade import __version__, constants

@@ -141,8 +140,6 @@ AVAILABLE_CLI_OPTIONS = {
         'Requires `--export` to be set as well. '
         'Example: `--export-filename=user_data/backtest_results/backtest_today.json`',
         metavar='PATH',
-        default=os.path.join('user_data', 'backtest_results',
-                             'backtest-result.json'),
     ),
     "fee": Arg(
         '--fee',
@@ -309,6 +306,12 @@ AVAILABLE_CLI_OPTIONS = {
         type=check_int_positive,
         metavar='INT',
     ),
+    "download_trades": Arg(
+        '--dl-trades',
+        help='Download trades instead of OHLCV data. The bot will resample trades to the '
+             'desired timeframe as specified as --timeframes/-t.',
+        action='store_true',
+    ),
     "exchange": Arg(
         '--exchange',
         help=f'Exchange name (default: `{constants.DEFAULT_EXCHANGE}`). '
@@ -192,6 +192,13 @@ class Configuration:
         config.update({'datadir': create_datadir(config, self.args.get("datadir", None))})
         logger.info('Using data directory: %s ...', config.get('datadir'))

+        if self.args.get('exportfilename'):
+            self._args_to_config(config, argname='exportfilename',
+                                 logstring='Storing backtest results to {} ...')
+        else:
+            config['exportfilename'] = (config['user_data_dir']
+                                        / 'backtest_results/backtest-result.json')
+
     def _process_optimize_options(self, config: Dict[str, Any]) -> None:

         # This will override the strategy configuration
@@ -235,9 +242,6 @@ class Configuration:
         self._args_to_config(config, argname='export',
                              logstring='Parameter --export detected: {} ...')

-        self._args_to_config(config, argname='exportfilename',
-                             logstring='Storing backtest results to {} ...')
-
         # Edge section:
         if 'stoploss_range' in self.args and self.args["stoploss_range"]:
             txt_range = eval(self.args["stoploss_range"])
@@ -312,6 +316,8 @@ class Configuration:

         self._args_to_config(config, argname='days',
                              logstring='Detected --days: {}')
+        self._args_to_config(config, argname='download_trades',
+                             logstring='Detected --dl-trades: {}')

     def _process_runmode(self, config: Dict[str, Any]) -> None:

@@ -42,9 +42,10 @@ class TimeRange:
                   (r'^-(\d{10})$', (None, 'date')),
                   (r'^(\d{10})-$', ('date', None)),
                   (r'^(\d{10})-(\d{10})$', ('date', 'date')),
-                  (r'^(-\d+)$', (None, 'line')),
-                  (r'^(\d+)-$', ('line', None)),
-                  (r'^(\d+)-(\d+)$', ('index', 'index'))]
+                  (r'^-(\d{13})$', (None, 'date')),
+                  (r'^(\d{13})-$', ('date', None)),
+                  (r'^(\d{13})-(\d{13})$', ('date', 'date')),
+                  ]
         for rex, stype in syntax:
             # Apply the regular expression to text
             match = re.match(rex, text)
@@ -57,6 +58,8 @@ class TimeRange:
                 starts = rvals[index]
                 if stype[0] == 'date' and len(starts) == 8:
                     start = arrow.get(starts, 'YYYYMMDD').timestamp
+                elif len(starts) == 13:
+                    start = int(starts) // 1000
                 else:
                     start = int(starts)
                 index += 1
@@ -64,6 +67,8 @@ class TimeRange:
                 stops = rvals[index]
                 if stype[1] == 'date' and len(stops) == 8:
                     stop = arrow.get(stops, 'YYYYMMDD').timestamp
+                elif len(stops) == 13:
+                    stop = int(stops) // 1000
                 else:
                     stop = int(stops)
             return TimeRange(stype[0], stype[1], start, stop)
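For illustration only - a simplified, standalone sketch of the pattern matching added in the two hunks above. This is not freqtrade's actual `TimeRange` class; the 8-digit date patterns are assumed from the documented `YYYYMMDD` syntax.

```python
import re
from datetime import datetime, timezone

# Simplified syntax table: YYYYMMDD dates, 10-digit second timestamps,
# and the 13-digit millisecond timestamps added in this change.
SYNTAX = [
    (r'^-(\d{8})$', (None, 'date')),
    (r'^(\d{8})-$', ('date', None)),
    (r'^(\d{8})-(\d{8})$', ('date', 'date')),
    (r'^-(\d{10})$', (None, 'date')),
    (r'^(\d{10})-$', ('date', None)),
    (r'^(\d{10})-(\d{10})$', ('date', 'date')),
    (r'^-(\d{13})$', (None, 'date')),
    (r'^(\d{13})-$', ('date', None)),
    (r'^(\d{13})-(\d{13})$', ('date', 'date')),
]


def _to_seconds(value: str) -> int:
    """Convert one matched group to a unix timestamp in seconds."""
    if len(value) == 8:          # YYYYMMDD date
        dt = datetime.strptime(value, '%Y%m%d').replace(tzinfo=timezone.utc)
        return int(dt.timestamp())
    if len(value) == 13:         # milliseconds -> seconds
        return int(value) // 1000
    return int(value)            # already seconds


def parse_timerange(text: str):
    """Return (start, stop) as unix timestamps in seconds; either side may be None."""
    for rex, stype in SYNTAX:
        match = re.match(rex, text)
        if match:
            groups = iter(match.groups())
            start = _to_seconds(next(groups)) if stype[0] else None
            stop = _to_seconds(next(groups)) if stype[1] else None
            return start, stop
    raise ValueError(f'Incorrect syntax for timerange "{text}"')


print(parse_timerange('20180131-20180301'))   # (1517356800, 1519862400)
print(parse_timerange('-1527618600000'))      # (None, 1527618600)
```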
@@ -10,7 +10,7 @@ DEFAULT_TICKER_INTERVAL = 5  # min
 HYPEROPT_EPOCH = 100  # epochs
 RETRY_TIMEOUT = 30  # sec
 DEFAULT_STRATEGY = 'DefaultStrategy'
-DEFAULT_HYPEROPT = 'DefaultHyperOpts'
+DEFAULT_HYPEROPT = 'DefaultHyperOpt'
 DEFAULT_HYPEROPT_LOSS = 'DefaultHyperOptLoss'
 DEFAULT_DB_PROD_URL = 'sqlite:///tradesv3.sqlite'
 DEFAULT_DB_DRYRUN_URL = 'sqlite://'
@@ -266,6 +266,6 @@ CONF_SCHEMA = {
         'stake_amount',
         'dry_run',
         'bid_strategy',
-        'telegram'
+        'unfilledtimeout',
     ]
 }
@@ -93,7 +93,7 @@ def load_trades_from_db(db_url: str) -> pd.DataFrame:
                       t.close_date.replace(tzinfo=pytz.UTC) if t.close_date else None,
                       t.calc_profit(), t.calc_profit_percent(),
                       t.open_rate, t.close_rate, t.amount,
-                      (t.close_date.timestamp() - t.open_date.timestamp()
+                      (round((t.close_date.timestamp() - t.open_date.timestamp()) / 60, 2)
                        if t.close_date else None),
                       t.sell_reason,
                       t.fee_open, t.fee_close,
@@ -114,3 +114,25 @@ def order_book_to_dataframe(bids: list, asks: list) -> DataFrame:
                        keys=['b_sum', 'b_size', 'bids', 'asks', 'a_size', 'a_sum'])
     # logger.info('order book %s', frame )
     return frame
+
+
+def trades_to_ohlcv(trades: list, timeframe: str) -> list:
+    """
+    Converts trades list to ohlcv list
+    :param trades: List of trades, as returned by ccxt.fetch_trades.
+    :param timeframe: Ticker timeframe to resample data to
+    :return: ohlcv timeframe as list (as returned by ccxt.fetch_ohlcv)
+    """
+    from freqtrade.exchange import timeframe_to_minutes
+    ticker_minutes = timeframe_to_minutes(timeframe)
+    df = pd.DataFrame(trades)
+    df['datetime'] = pd.to_datetime(df['datetime'])
+    df = df.set_index('datetime')
+
+    df_new = df['price'].resample(f'{ticker_minutes}min').ohlc()
+    df_new['volume'] = df['amount'].resample(f'{ticker_minutes}min').sum()
+    df_new['date'] = df_new.index.astype("int64") // 10 ** 6
+    # Drop 0 volume rows
+    df_new = df_new.dropna()
+    columns = ["date", "open", "high", "low", "close", "volume"]
+    return list(zip(*[df_new[x].values.tolist() for x in columns]))
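As a quick, hedged illustration of what the new `trades_to_ohlcv()` helper does, this standalone snippet resamples a hand-made trades list with plain pandas. The field names follow the ccxt trade format the helper assumes; the values are made up, and this is not part of the commit.

```python
import pandas as pd

# Three made-up trades in (simplified) ccxt format: datetime, price, amount.
# Timestamps are kept timezone-naive here and treated as UTC for simplicity.
trades = [
    {'datetime': '2019-08-14 15:59:10', 'price': 0.019627, 'amount': 0.5},
    {'datetime': '2019-08-14 15:59:55', 'price': 0.019629, 'amount': 1.2},
    {'datetime': '2019-08-14 16:00:30', 'price': 0.019625, 'amount': 0.8},
]

df = pd.DataFrame(trades)
df['datetime'] = pd.to_datetime(df['datetime'])
df = df.set_index('datetime')

# Resample to 1-minute candles - the same steps trades_to_ohlcv() applies for any timeframe.
ohlcv = df['price'].resample('1min').ohlc()
ohlcv['volume'] = df['amount'].resample('1min').sum()
ohlcv['date'] = ohlcv.index.astype('int64') // 10 ** 6   # epoch milliseconds
ohlcv = ohlcv.dropna()

print(ohlcv[['date', 'open', 'high', 'low', 'close', 'volume']])
```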
@@ -17,7 +17,7 @@ from pandas import DataFrame

 from freqtrade import OperationalException, misc
 from freqtrade.configuration import TimeRange
-from freqtrade.data.converter import parse_ticker_dataframe
+from freqtrade.data.converter import parse_ticker_dataframe, trades_to_ohlcv
 from freqtrade.exchange import Exchange, timeframe_to_minutes

 logger = logging.getLogger(__name__)
@@ -33,20 +33,12 @@ def trim_tickerlist(tickerlist: List[Dict], timerange: TimeRange) -> List[Dict]:
     start_index = 0
     stop_index = len(tickerlist)

-    if timerange.starttype == 'line':
-        stop_index = timerange.startts
-    if timerange.starttype == 'index':
-        start_index = timerange.startts
-    elif timerange.starttype == 'date':
+    if timerange.starttype == 'date':
         while (start_index < len(tickerlist) and
                tickerlist[start_index][0] < timerange.startts * 1000):
             start_index += 1

-    if timerange.stoptype == 'line':
-        start_index = max(len(tickerlist) + timerange.stopts, 0)
-    if timerange.stoptype == 'index':
-        stop_index = timerange.stopts
-    elif timerange.stoptype == 'date':
+    if timerange.stoptype == 'date':
         while (stop_index > 0 and
                tickerlist[stop_index-1][0] > timerange.stopts * 1000):
             stop_index -= 1
@@ -82,6 +74,29 @@ def store_tickerdata_file(datadir: Path, pair: str,
     misc.file_dump_json(filename, data, is_zip=is_zip)


+def load_trades_file(datadir: Path, pair: str,
+                     timerange: Optional[TimeRange] = None) -> List[Dict]:
+    """
+    Load a pair from file, either .json.gz or .json
+    :return: tradelist or empty list if unsuccesful
+    """
+    filename = pair_trades_filename(datadir, pair)
+    tradesdata = misc.file_load_json(filename)
+    if not tradesdata:
+        return []
+
+    return tradesdata
+
+
+def store_trades_file(datadir: Path, pair: str,
+                      data: list, is_zip: bool = True):
+    """
+    Stores tickerdata to file
+    """
+    filename = pair_trades_filename(datadir, pair)
+    misc.file_dump_json(filename, data, is_zip=is_zip)
+
+
 def _validate_pairdata(pair, pairdata, timerange: TimeRange):
     if timerange.starttype == 'date' and pairdata[0][0] > timerange.startts * 1000:
         logger.warning('Missing data at start for pair %s, data starts at %s',
@@ -173,6 +188,12 @@ def pair_data_filename(datadir: Path, pair: str, ticker_interval: str) -> Path:
     return filename


+def pair_trades_filename(datadir: Path, pair: str) -> Path:
+    pair_s = pair.replace("/", "_")
+    filename = datadir.joinpath(f'{pair_s}-trades.json.gz')
+    return filename
+
+
 def _load_cached_data_for_updating(datadir: Path, pair: str, ticker_interval: str,
                                    timerange: Optional[TimeRange]) -> Tuple[List[Any],
                                                                             Optional[int]]:
@@ -299,6 +320,92 @@ def refresh_backtest_ohlcv_data(exchange: Exchange, pairs: List[str], timeframes
     return pairs_not_available


+def download_trades_history(datadir: Path,
+                            exchange: Exchange,
+                            pair: str,
+                            timerange: Optional[TimeRange] = None) -> bool:
+    """
+    Download trade history from the exchange.
+    Appends to previously downloaded trades data.
+    """
+    try:
+
+        since = timerange.startts * 1000 if timerange and timerange.starttype == 'date' else None
+
+        trades = load_trades_file(datadir, pair)
+
+        from_id = trades[-1]['id'] if trades else None
+
+        logger.debug("Current Start: %s", trades[0]['datetime'] if trades else 'None')
+        logger.debug("Current End: %s", trades[-1]['datetime'] if trades else 'None')
+
+        new_trades = exchange.get_historic_trades(pair=pair,
+                                                  since=since if since else
+                                                  int(arrow.utcnow().shift(
+                                                      days=-30).float_timestamp) * 1000,
+                                                  # until=xxx,
+                                                  from_id=from_id,
+                                                  )
+        trades.extend(new_trades[1])
+        store_trades_file(datadir, pair, trades)
+
+        logger.debug("New Start: %s", trades[0]['datetime'])
+        logger.debug("New End: %s", trades[-1]['datetime'])
+        logger.info(f"New Amount of trades: {len(trades)}")
+        return True
+
+    except Exception as e:
+        logger.error(
+            f'Failed to download historic trades for pair: "{pair}". '
+            f'Error: {e}'
+        )
+        return False
+
+
+def refresh_backtest_trades_data(exchange: Exchange, pairs: List[str], datadir: Path,
+                                 timerange: TimeRange, erase=False) -> List[str]:
+    """
+    Refresh stored trades data.
+    Used by freqtrade download-data
+    :return: Pairs not available
+    """
+    pairs_not_available = []
+    for pair in pairs:
+        if pair not in exchange.markets:
+            pairs_not_available.append(pair)
+            logger.info(f"Skipping pair {pair}...")
+            continue
+
+        dl_file = pair_trades_filename(datadir, pair)
+        if erase and dl_file.exists():
+            logger.info(
+                f'Deleting existing data for pair {pair}.')
+            dl_file.unlink()
+
+        logger.info(f'Downloading trades for pair {pair}.')
+        download_trades_history(datadir=datadir, exchange=exchange,
+                                pair=pair,
+                                timerange=timerange)
+    return pairs_not_available
+
+
+def convert_trades_to_ohlcv(pairs: List[str], timeframes: List[str],
+                            datadir: Path, timerange: TimeRange, erase=False) -> None:
+    """
+    Convert stored trades data to ohlcv data
+    """
+    for pair in pairs:
+        trades = load_trades_file(datadir, pair)
+        for timeframe in timeframes:
+            ohlcv_file = pair_data_filename(datadir, pair, timeframe)
+            if erase and ohlcv_file.exists():
+                logger.info(f'Deleting existing data for pair {pair}, interval {timeframe}.')
+                ohlcv_file.unlink()
+            ohlcv = trades_to_ohlcv(trades, timeframe)
+            # Store ohlcv
+            store_tickerdata_file(datadir, pair, timeframe, data=ohlcv)
+
+
 def get_timeframe(data: Dict[str, DataFrame]) -> Tuple[arrow.Arrow, arrow.Arrow]:
     """
     Get the maximum timeframe for the given backtest data
@@ -1,4 +1,4 @@
-from freqtrade.exchange.exchange import Exchange  # noqa: F401
+from freqtrade.exchange.exchange import Exchange, MAP_EXCHANGE_CHILDCLASS  # noqa: F401
 from freqtrade.exchange.exchange import (get_exchange_bad_reason,  # noqa: F401
                                          is_exchange_bad,
                                          is_exchange_known_ccxt,
@@ -16,6 +16,8 @@ class Binance(Exchange):
     _ft_has: Dict = {
         "stoploss_on_exchange": True,
         "order_time_in_force": ['gtc', 'fok', 'ioc'],
+        "trades_pagination": "id",
+        "trades_pagination_arg": "fromId",
     }

     def get_order_book(self, pair: str, limit: int = 100) -> dict:
@@ -103,6 +103,11 @@ BAD_EXCHANGES = {
     ], "Does not provide timeframes. ccxt fetchOHLCV: emulated"),
 }

+MAP_EXCHANGE_CHILDCLASS = {
+    'binanceus': 'binance',
+    'binanceje': 'binance',
+}
+

 def retrier_async(f):
     async def wrapper(*args, **kwargs):
@@ -143,6 +148,8 @@ def retrier(f):
 class Exchange:

     _config: Dict = {}
+
+    # Parameters to add directly to buy/sell calls (like agreeing to trading agreement)
     _params: Dict = {}

     # Dict to specify which options each exchange implements
@@ -153,6 +160,9 @@ class Exchange:
         "order_time_in_force": ["gtc"],
         "ohlcv_candle_limit": 500,
         "ohlcv_partial_candle": True,
+        "trades_pagination": "time",  # Possible are "time" or "id"
+        "trades_pagination_arg": "since",
+
     }
     _ft_has: Dict = {}

@@ -196,6 +206,9 @@ class Exchange:
         self._ohlcv_candle_limit = self._ft_has['ohlcv_candle_limit']
         self._ohlcv_partial_candle = self._ft_has['ohlcv_partial_candle']

+        self._trades_pagination = self._ft_has['trades_pagination']
+        self._trades_pagination_arg = self._ft_has['trades_pagination_arg']
+
         # Initialize ccxt objects
         self._api = self._init_ccxt(
             exchange_config, ccxt_kwargs=exchange_config.get('ccxt_config'))
@@ -760,6 +773,154 @@
         except ccxt.BaseError as e:
             raise OperationalException(f'Could not fetch ticker data. Msg: {e}') from e

+    @retrier_async
+    async def _async_fetch_trades(self, pair: str,
+                                  since: Optional[int] = None,
+                                  params: Optional[dict] = None) -> List[Dict]:
+        """
+        Asyncronously gets trade history using fetch_trades.
+        Handles exchange errors, does one call to the exchange.
+        :param pair: Pair to fetch trade data for
+        :param since: Since as integer timestamp in milliseconds
+        returns: List of dicts containing trades
+        """
+        try:
+            # fetch trades asynchronously
+            if params:
+                logger.debug("Fetching trades for pair %s, params: %s ", pair, params)
+                trades = await self._api_async.fetch_trades(pair, params=params, limit=1000)
+            else:
+                logger.debug(
+                    "Fetching trades for pair %s, since %s %s...",
+                    pair, since,
+                    '(' + arrow.get(since // 1000).isoformat() + ') ' if since is not None else ''
+                )
+                trades = await self._api_async.fetch_trades(pair, since=since, limit=1000)
+            return trades
+        except ccxt.NotSupported as e:
+            raise OperationalException(
+                f'Exchange {self._api.name} does not support fetching historical trade data.'
+                f'Message: {e}') from e
+        except (ccxt.NetworkError, ccxt.ExchangeError) as e:
+            raise TemporaryError(f'Could not load trade history due to {e.__class__.__name__}. '
+                                 f'Message: {e}') from e
+        except ccxt.BaseError as e:
+            raise OperationalException(f'Could not fetch trade data. Msg: {e}') from e
+
+    async def _async_get_trade_history_id(self, pair: str,
+                                          until: int,
+                                          since: Optional[int] = None,
+                                          from_id: Optional[str] = None) -> Tuple[str, List[Dict]]:
+        """
+        Asyncronously gets trade history using fetch_trades
+        use this when exchange uses id-based iteration (check `self._trades_pagination`)
+        :param pair: Pair to fetch trade data for
+        :param since: Since as integer timestamp in milliseconds
+        :param until: Until as integer timestamp in milliseconds
+        :param from_id: Download data starting with ID (if id is known). Ignores "since" if set.
+        returns tuple: (pair, trades-list)
+        """
+
+        trades: List[Dict] = []
+
+        if not from_id:
+            # Fetch first elements using timebased method to get an ID to paginate on
+            # Depending on the Exchange, this can introduce a drift at the start of the interval
+            # of up to an hour.
+            # e.g. Binance returns the "last 1000" candles within a 1h time interval
+            # - so we will miss the first trades.
+            t = await self._async_fetch_trades(pair, since=since)
+            from_id = t[-1]['id']
+            trades.extend(t[:-1])
+        while True:
+            t = await self._async_fetch_trades(pair,
+                                               params={self._trades_pagination_arg: from_id})
+            if len(t):
+                # Skip last id since its the key for the next call
+                trades.extend(t[:-1])
+                if from_id == t[-1]['id'] or t[-1]['timestamp'] > until:
+                    logger.debug(f"Stopping because from_id did not change. "
+                                 f"Reached {t[-1]['timestamp']} > {until}")
+                    # Reached the end of the defined-download period - add last trade as well.
+                    trades.extend(t[-1:])
+                    break
+
+                from_id = t[-1]['id']
+            else:
+                break
+
+        return (pair, trades)
+
+    async def _async_get_trade_history_time(self, pair: str, until: int,
+                                            since: Optional[int] = None) -> Tuple[str, List]:
+        """
+        Asyncronously gets trade history using fetch_trades,
+        when the exchange uses time-based iteration (check `self._trades_pagination`)
+        :param pair: Pair to fetch trade data for
+        :param since: Since as integer timestamp in milliseconds
+        :param until: Until as integer timestamp in milliseconds
+        returns tuple: (pair, trades-list)
+        """
+
+        trades: List[Dict] = []
+        while True:
+            t = await self._async_fetch_trades(pair, since=since)
+            if len(t):
+                since = t[-1]['timestamp']
+                trades.extend(t)
+                # Reached the end of the defined-download period
+                if until and t[-1]['timestamp'] > until:
+                    logger.debug(
+                        f"Stopping because until was reached. {t[-1]['timestamp']} > {until}")
+                    break
+            else:
+                break
+
+        return (pair, trades)
+
+    async def _async_get_trade_history(self, pair: str,
+                                       since: Optional[int] = None,
+                                       until: Optional[int] = None,
+                                       from_id: Optional[str] = None) -> Tuple[str, List[Dict]]:
+        """
+        Async wrapper handling downloading trades using either time or id based methods.
+        """
+
+        if self._trades_pagination == 'time':
+            return await self._async_get_trade_history_time(
+                pair=pair, since=since,
+                until=until or ccxt.Exchange.milliseconds())
+        elif self._trades_pagination == 'id':
+            return await self._async_get_trade_history_id(
+                pair=pair, since=since,
+                until=until or ccxt.Exchange.milliseconds(), from_id=from_id
+            )
+        else:
+            raise OperationalException(f"Exchange {self.name} does use neither time, "
+                                       f"nor id based pagination")
+
+    def get_historic_trades(self, pair: str,
+                            since: Optional[int] = None,
+                            until: Optional[int] = None,
+                            from_id: Optional[str] = None) -> Tuple[str, List]:
+        """
+        Gets candle history using asyncio and returns the list of candles.
+        Handles all async doing.
+        Async over one pair, assuming we get `_ohlcv_candle_limit` candles per call.
+        :param pair: Pair to download
+        :param ticker_interval: Interval to get
+        :param since: Timestamp in milliseconds to get history from
+        :param until: Timestamp in milliseconds. Defaults to current timestamp if not defined.
+        :param from_id: Download data starting with ID (if id is known)
+        :returns List of tickers
+        """
+        if not self.exchange_has("fetchTrades"):
+            raise OperationalException("This exchange does not suport downloading Trades.")
+
+        return asyncio.get_event_loop().run_until_complete(
+            self._async_get_trade_history(pair=pair, since=since,
+                                          until=until, from_id=from_id))
+
     @retrier
     def cancel_order(self, order_id: str, pair: str) -> None:
         if self._config['dry_run']:
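The two `_async_get_trade_history_*` helpers above differ only in how they page through results. Below is a deliberately simplified, synchronous sketch of the two pagination styles against a caller-supplied fetch function; it is an illustration, not the exchange code itself.

```python
from typing import Callable, Dict, List


def fetch_time_paginated(fetch: Callable[[int], List[Dict]], since: int, until: int) -> List[Dict]:
    """Time-based pagination: ask again starting from the last trade's timestamp."""
    trades: List[Dict] = []
    while True:
        batch = fetch(since)
        if not batch:
            break
        trades.extend(batch)
        since = batch[-1]['timestamp']
        if since > until:            # passed the end of the requested window
            break
    return trades


def fetch_id_paginated(fetch_from_id: Callable[[str], List[Dict]], first_id: str,
                       until: int) -> List[Dict]:
    """Id-based pagination: the last id of a batch is the cursor for the next request."""
    trades: List[Dict] = []
    from_id = first_id
    while True:
        batch = fetch_from_id(from_id)
        if not batch:
            break
        trades.extend(batch[:-1])    # keep the last element as the next cursor
        if from_id == batch[-1]['id'] or batch[-1]['timestamp'] > until:
            trades.extend(batch[-1:])
            break
        from_id = batch[-1]['id']
    return trades
```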
@@ -14,6 +14,10 @@ logger = logging.getLogger(__name__)
 class Kraken(Exchange):

     _params: Dict = {"trading_agreement": "agree"}
+    _ft_has: Dict = {
+        "trades_pagination": "id",
+        "trades_pagination_arg": "since",
+    }

     @retrier
     def get_balances(self) -> dict:
@@ -11,7 +11,7 @@ from typing import Any, Dict, List, Optional, Tuple
 import arrow
 from requests.exceptions import RequestException

-from freqtrade import (DependencyException, OperationalException, InvalidOrderException,
+from freqtrade import (DependencyException, InvalidOrderException,
                        __version__, constants, persistence)
 from freqtrade.data.converter import order_book_to_dataframe
 from freqtrade.data.dataprovider import DataProvider
@@ -466,11 +466,12 @@ class FreqtradeBot:
         if result:
             self.wallets.update()

-    def get_real_amount(self, trade: Trade, order: Dict) -> float:
+    def get_real_amount(self, trade: Trade, order: Dict, order_amount: float = None) -> float:
         """
         Get real amount for the trade
         Necessary for exchanges which charge fees in base currency (e.g. binance)
         """
-        order_amount = order['amount']
+        if order_amount is None:
+            order_amount = order['amount']
         # Only run for closed orders
         if trade.fee_open == 0 or order['status'] == 'open':
@@ -508,7 +509,7 @@ class FreqtradeBot:

         if not isclose(amount, order_amount, abs_tol=constants.MATH_CLOSE_PREC):
             logger.warning(f"Amount {amount} does not match amount {trade.amount}")
-            raise OperationalException("Half bought? Amounts don't match")
+            raise DependencyException("Half bought? Amounts don't match")
         real_amount = amount - fee_abs
         if fee_abs != 0:
             logger.info(f"Applying fee on amount for {trade} "
@@ -536,7 +537,7 @@ class FreqtradeBot:
                 # Fee was applied, so set to 0
                 trade.fee_open = 0

-        except OperationalException as exception:
+        except DependencyException as exception:
             logger.warning("Could not update trade amount: %s", exception)

         trade.update(order)
@@ -705,7 +706,7 @@ class FreqtradeBot:
             if trade.stop_loss > float(order['info']['stopPrice']):
                 # we check if the update is neccesary
                 update_beat = self.strategy.order_types.get('stoploss_on_exchange_interval', 60)
-                if (datetime.utcnow() - trade.stoploss_last_update).total_seconds() > update_beat:
+                if (datetime.utcnow() - trade.stoploss_last_update).total_seconds() >= update_beat:
                     # cancelling the current stoploss on exchange first
                     logger.info('Trailing stoploss: cancelling current stoploss on exchange (id:{%s})'
                                 'in order to add another one ...', order['id'])
@@ -747,8 +748,8 @@ class FreqtradeBot:
         """
         buy_timeout = self.config['unfilledtimeout']['buy']
         sell_timeout = self.config['unfilledtimeout']['sell']
-        buy_timeoutthreashold = arrow.utcnow().shift(minutes=-buy_timeout).datetime
-        sell_timeoutthreashold = arrow.utcnow().shift(minutes=-sell_timeout).datetime
+        buy_timeout_threshold = arrow.utcnow().shift(minutes=-buy_timeout).datetime
+        sell_timeout_threshold = arrow.utcnow().shift(minutes=-sell_timeout).datetime

         for trade in Trade.query.filter(Trade.open_order_id.isnot(None)).all():
             try:
@@ -772,19 +773,16 @@ class FreqtradeBot:
                     self.wallets.update()
                     continue

-                # Handle cancelled on exchange
-                if order['status'] == 'canceled':
-                    if order['side'] == 'buy':
+                if ((order['side'] == 'buy' and order['status'] == 'canceled')
+                        or (order['status'] == 'open'
+                            and order['side'] == 'buy' and ordertime < buy_timeout_threshold)):
-                        self.handle_buy_order_full_cancel(trade, "canceled on Exchange")
-                    elif order['side'] == 'sell':
-                        self.handle_timedout_limit_sell(trade, order)
-                        self.wallets.update()
-                # Check if order is still actually open
-                elif order['status'] == 'open':
-                    if order['side'] == 'buy' and ordertime < buy_timeoutthreashold:
                     self.handle_timedout_limit_buy(trade, order)
                     self.wallets.update()
-                    elif order['side'] == 'sell' and ordertime < sell_timeoutthreashold:
+                elif ((order['side'] == 'sell' and order['status'] == 'canceled')
+                      or (order['status'] == 'open'
+                          and order['side'] == 'sell' and ordertime < sell_timeout_threshold)):
                     self.handle_timedout_limit_sell(trade, order)
                     self.wallets.update()

@@ -802,16 +800,33 @@ class FreqtradeBot:
         """Buy timeout - cancel order
         :return: True if order was fully cancelled
         """
-        self.exchange.cancel_order(trade.open_order_id, trade.pair)
-        if order['remaining'] == order['amount']:
+        reason = "cancelled due to timeout"
+        if order['status'] != 'canceled':
+            corder = self.exchange.cancel_order(trade.open_order_id, trade.pair)
+        else:
+            # Order was cancelled already, so we can reuse the existing dict
+            corder = order
+            reason = "canceled on Exchange"
+
+        if corder['remaining'] == corder['amount']:
             # if trade is not partially completed, just delete the trade
-            self.handle_buy_order_full_cancel(trade, "cancelled due to timeout")
+            self.handle_buy_order_full_cancel(trade, reason)
             return True

         # if trade is partially complete, edit the stake details for the trade
         # and close the order
-        trade.amount = order['amount'] - order['remaining']
+        trade.amount = corder['amount'] - corder['remaining']
         trade.stake_amount = trade.amount * trade.open_rate
+        # verify if fees were taken from amount to avoid problems during selling
+        try:
+            new_amount = self.get_real_amount(trade, corder, trade.amount)
+            if not isclose(order['amount'], new_amount, abs_tol=constants.MATH_CLOSE_PREC):
+                trade.amount = new_amount
+                # Fee was applied, so set to 0
+                trade.fee_open = 0
+        except DependencyException as e:
+            logger.warning("Could not update trade amount: %s", e)
+
         trade.open_order_id = None
         logger.info('Partial buy order timeout for %s.', trade)
         self.rpc.send_msg({
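A small numeric illustration (made-up values) of the partial-fill handling added in this hunk; `corder` stands in for the ccxt-style order dict, and the fee step is only a rough stand-in for what `get_real_amount()` verifies.

```python
from math import isclose

# Hypothetical ccxt-style order dict for a timed-out buy: 100 units ordered, 60 never filled.
corder = {'amount': 100.0, 'remaining': 60.0}
open_rate = 0.0025
fee_in_base = 0.04           # fee taken from the bought amount, as e.g. Binance does

# Step 1 (as in the new code): shrink the trade to the filled part and recompute the stake.
amount = corder['amount'] - corder['remaining']      # 40.0
stake_amount = amount * open_rate                    # 0.1

# Step 2 (roughly what get_real_amount() checks): if the fee was charged in the base
# currency, the wallet actually holds slightly less than the filled amount.
real_amount = amount - fee_in_base                   # 39.96
if not isclose(amount, real_amount, abs_tol=1e-8):
    amount = real_amount

print(f"amount={amount}, stake_amount={stake_amount}")
```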
@@ -72,8 +72,10 @@ def json_load(datafile: IO):

 def file_load_json(file):

-    gzipfile = file.with_suffix(file.suffix + '.gz')
+    if file.suffix != ".gz":
+        gzipfile = file.with_suffix(file.suffix + '.gz')
+    else:
+        gzipfile = file
     # Try gzip file first, otherwise regular json file.
     if gzipfile.is_file():
         logger.debug('Loading ticker data from file %s', gzipfile)
@@ -11,7 +11,7 @@ import freqtrade.vendor.qtpylib.indicators as qtpylib
 from freqtrade.optimize.hyperopt_interface import IHyperOpt


-class DefaultHyperOpts(IHyperOpt):
+class DefaultHyperOpt(IHyperOpt):
     """
     Default hyperopt provided by the Freqtrade bot.
     You can override it with your own Hyperopt
@@ -3,7 +3,7 @@ This module loads custom exchanges
 """
 import logging

-from freqtrade.exchange import Exchange
+from freqtrade.exchange import Exchange, MAP_EXCHANGE_CHILDCLASS
 import freqtrade.exchange as exchanges
 from freqtrade.resolvers import IResolver

@@ -22,6 +22,8 @@ class ExchangeResolver(IResolver):
         Load the custom class from config parameter
         :param config: configuration dictionary
         """
+        # Map exchange name to avoid duplicate classes for identical exchanges
+        exchange_name = MAP_EXCHANGE_CHILDCLASS.get(exchange_name, exchange_name)
         exchange_name = exchange_name.title()
         try:
             self.exchange = self._load_exchange(exchange_name, kwargs={'config': config})
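A tiny standalone sketch of the name mapping the resolver now applies before building the class name (the function name here is illustrative; the dict mirrors `MAP_EXCHANGE_CHILDCLASS` from the exchange module):

```python
# Exchanges that are API-compatible children of another exchange share one class.
MAP_EXCHANGE_CHILDCLASS = {
    'binanceus': 'binance',
    'binanceje': 'binance',
}


def resolve_exchange_class_name(exchange_name: str) -> str:
    # Map e.g. 'binanceus' -> 'binance' first, then build the class name as before.
    exchange_name = MAP_EXCHANGE_CHILDCLASS.get(exchange_name, exchange_name)
    return exchange_name.title()


assert resolve_exchange_class_name('binanceus') == 'Binance'
assert resolve_exchange_class_name('kraken') == 'Kraken'
```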
@@ -52,14 +52,8 @@ class HyperOptResolver(IResolver):
         """
         current_path = Path(__file__).parent.parent.joinpath('optimize').resolve()

-        abs_paths = [
-            config['user_data_dir'].joinpath('hyperopts'),
-            current_path,
-        ]
-
-        if extra_dir:
-            # Add extra hyperopt directory on top of search paths
-            abs_paths.insert(0, Path(extra_dir).resolve())
+        abs_paths = self.build_search_paths(config, current_path=current_path,
+                                            user_subdir='hyperopts', extra_dir=extra_dir)

         hyperopt = self._load_object(paths=abs_paths, object_type=IHyperOpt,
                                      object_name=hyperopt_name, kwargs={'config': config})
@ -109,14 +103,8 @@ class HyperOptLossResolver(IResolver):
|
||||||
"""
|
"""
|
||||||
current_path = Path(__file__).parent.parent.joinpath('optimize').resolve()
|
current_path = Path(__file__).parent.parent.joinpath('optimize').resolve()
|
||||||
|
|
||||||
abs_paths = [
|
abs_paths = self.build_search_paths(config, current_path=current_path,
|
||||||
config['user_data_dir'].joinpath('hyperopts'),
|
user_subdir='hyperopts', extra_dir=extra_dir)
|
||||||
current_path,
|
|
||||||
]
|
|
||||||
|
|
||||||
if extra_dir:
|
|
||||||
# Add extra hyperopt directory on top of search paths
|
|
||||||
abs_paths.insert(0, Path(extra_dir).resolve())
|
|
||||||
|
|
||||||
hyperoptloss = self._load_object(paths=abs_paths, object_type=IHyperOptLoss,
|
hyperoptloss = self._load_object(paths=abs_paths, object_type=IHyperOptLoss,
|
||||||
object_name=hyper_loss_name)
|
object_name=hyper_loss_name)
|
||||||
|
|
|
@ -7,7 +7,7 @@ import importlib.util
|
||||||
import inspect
|
import inspect
|
||||||
import logging
|
import logging
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
from typing import Any, List, Optional, Tuple, Type, Union
|
from typing import Any, List, Optional, Tuple, Union, Generator
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
logger = logging.getLogger(__name__)
|
||||||
|
|
||||||
|
@ -17,15 +17,29 @@ class IResolver:
|
||||||
This class contains all the logic to load custom classes
|
This class contains all the logic to load custom classes
|
||||||
"""
|
"""
|
||||||
|
|
||||||
|
def build_search_paths(self, config, current_path: Path, user_subdir: str,
|
||||||
|
extra_dir: Optional[str] = None) -> List[Path]:
|
||||||
|
|
||||||
|
abs_paths = [
|
||||||
|
config['user_data_dir'].joinpath(user_subdir),
|
||||||
|
current_path,
|
||||||
|
]
|
||||||
|
|
||||||
|
if extra_dir:
|
||||||
|
# Add extra directory to the top of the search paths
|
||||||
|
abs_paths.insert(0, Path(extra_dir).resolve())
|
||||||
|
|
||||||
|
return abs_paths
|
||||||
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
def _get_valid_object(object_type, module_path: Path,
|
def _get_valid_object(object_type, module_path: Path,
|
||||||
object_name: str) -> Optional[Type[Any]]:
|
object_name: str) -> Generator[Any, None, None]:
|
||||||
"""
|
"""
|
||||||
Returns the first object with matching object_type and object_name in the path given.
|
Generator returning objects with matching object_type and object_name in the path given.
|
||||||
:param object_type: object_type (class)
|
:param object_type: object_type (class)
|
||||||
:param module_path: absolute path to the module
|
:param module_path: absolute path to the module
|
||||||
:param object_name: Class name of the object
|
:param object_name: Class name of the object
|
||||||
:return: class or None
|
:return: generator containing matching objects
|
||||||
"""
|
"""
|
||||||
|
|
||||||
# Generate spec based on absolute path
|
# Generate spec based on absolute path
|
||||||
|
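Note (illustration, not part of this commit): the duplicated path-building blocks in the individual resolvers are folded into the shared `IResolver.build_search_paths` helper above. A self-contained sketch of the same ordering rule, using only `pathlib` - user directory first, bundled directory second, and an optional extra directory pushed to the front (the directory names are illustrative):

```python
from pathlib import Path
from typing import List, Optional


def build_search_paths(user_data_dir: Path, current_path: Path,
                       user_subdir: str, extra_dir: Optional[str] = None) -> List[Path]:
    # User-supplied objects shadow the ones bundled with the bot ...
    paths = [user_data_dir.joinpath(user_subdir), current_path]
    # ... and an explicitly passed directory shadows both.
    if extra_dir:
        paths.insert(0, Path(extra_dir).resolve())
    return paths


print(build_search_paths(Path("user_data"), Path("freqtrade/optimize"),
                         "hyperopts", extra_dir="/tmp/my_hyperopts"))
```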
@@ -42,7 +56,7 @@ class IResolver:
 obj for name, obj in inspect.getmembers(module, inspect.isclass)
 if object_name == name and object_type in obj.__bases__
 )
-return next(valid_objects_gen, None)
+return valid_objects_gen

 @staticmethod
 def _search_object(directory: Path, object_type, object_name: str,

@@ -59,9 +73,9 @@ class IResolver:
 logger.debug('Ignoring %s', entry)
 continue
 module_path = entry.resolve()
-obj = IResolver._get_valid_object(
-object_type, module_path, object_name
-)
+obj = next(IResolver._get_valid_object(object_type, module_path, object_name), None)
 if obj:
 return (obj(**kwargs), module_path)
 return (None, None)
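Note (illustration, not part of this commit): `_get_valid_object` now returns a generator instead of a single class, and the existing caller takes the first match with `next(..., None)`. A tiny standard-library sketch of that consumer pattern; the classes and helper name are made up:

```python
import inspect
import sys


class IPairList: ...
class VolumePairList(IPairList): ...


def first_matching(module, object_type, object_name):
    # Yield every class in `module` with the requested name and direct base class,
    # then pick the first one (or None when nothing matches).
    gen = (obj for name, obj in inspect.getmembers(module, inspect.isclass)
           if name == object_name and object_type in obj.__bases__)
    return next(gen, None)


print(first_matching(sys.modules[__name__], IPairList, "VolumePairList"))  # the class
print(first_matching(sys.modules[__name__], IPairList, "DoesNotExist"))    # None
```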
@@ -1,7 +1,7 @@
 # pragma pylint: disable=attribute-defined-outside-init

 """
-This module load custom hyperopts
+This module load custom pairlists
 """
 import logging
 from pathlib import Path

@@ -15,7 +15,7 @@ logger = logging.getLogger(__name__)

 class PairListResolver(IResolver):
 """
-This class contains all the logic to load custom hyperopt class
+This class contains all the logic to load custom PairList class
 """

 __slots__ = ['pairlist']

@@ -39,10 +39,8 @@ class PairListResolver(IResolver):
 """
 current_path = Path(__file__).parent.parent.joinpath('pairlist').resolve()

-abs_paths = [
+abs_paths = self.build_search_paths(config, current_path=current_path,
-config['user_data_dir'].joinpath('pairlist'),
+user_subdir='pairlist', extra_dir=None)
-current_path,
-]

 pairlist = self._load_object(paths=abs_paths, object_type=IPairList,
 object_name=pairlist_name, kwargs=kwargs)

@@ -95,7 +95,10 @@ class StrategyResolver(IResolver):
 logger.info("Override strategy '%s' with value in config file: %s.",
 attribute, config[attribute])
 elif hasattr(self.strategy, attribute):
-config[attribute] = getattr(self.strategy, attribute)
+val = getattr(self.strategy, attribute)
+# None's cannot exist in the config, so do not copy them
+if val is not None:
+config[attribute] = val
 # Explicitly check for None here as other "falsy" values are possible
 elif default is not None:
 setattr(self.strategy, attribute, default)

@@ -121,14 +124,8 @@ class StrategyResolver(IResolver):
 """
 current_path = Path(__file__).parent.parent.joinpath('strategy').resolve()

-abs_paths = [
+abs_paths = self.build_search_paths(config, current_path=current_path,
-config['user_data_dir'].joinpath('strategies'),
+user_subdir='strategies', extra_dir=extra_dir)
-current_path,
-]
-
-if extra_dir:
-# Add extra strategy directory on top of search paths
-abs_paths.insert(0, Path(extra_dir).resolve())

 if ":" in strategy_name:
 logger.info("loading base64 encoded strategy")

@@ -18,7 +18,7 @@ class RPCManager:
 self.registered_modules: List[RPC] = []

 # Enable telegram
-if freqtrade.config['telegram'].get('enabled', False):
+if freqtrade.config.get('telegram', {}).get('enabled', False):
 logger.info('Enabling rpc.telegram ...')
 from freqtrade.rpc.telegram import Telegram
 self.registered_modules.append(Telegram(freqtrade))
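Note (illustration, not part of this commit): the chained `.get()` above makes the whole `telegram` section optional instead of raising `KeyError` when it is missing entirely. A minimal illustration with plain dictionaries:

```python
config_without_telegram = {"dry_run": True}
config_with_telegram = {"telegram": {"enabled": True, "token": "..."}}


def telegram_enabled(config: dict) -> bool:
    # An absent 'telegram' section now simply means "disabled" instead of a KeyError.
    return config.get("telegram", {}).get("enabled", False)


assert telegram_enabled(config_without_telegram) is False
assert telegram_enabled(config_with_telegram) is True
```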
@@ -78,8 +78,8 @@ class IStrategy(ABC):

 # trailing stoploss
 trailing_stop: bool = False
-trailing_stop_positive: float
+trailing_stop_positive: Optional[float] = None
-trailing_stop_positive_offset: float
+trailing_stop_positive_offset: float = 0.0
 trailing_only_offset_is_reached = False

 # associated ticker interval

@@ -347,26 +347,23 @@ class IStrategy(ABC):
 decides to sell or not
 :param current_profit: current profit in percent
 """
-trailing_stop = self.config.get('trailing_stop', False)
 stop_loss_value = force_stoploss if force_stoploss else self.stoploss

 # Initiate stoploss with open_rate. Does nothing if stoploss is already set.
 trade.adjust_stop_loss(trade.open_rate, stop_loss_value, initial=True)

-if trailing_stop:
+if self.trailing_stop:
 # trailing stoploss handling
-sl_offset = self.config.get('trailing_stop_positive_offset') or 0.0
+sl_offset = self.trailing_stop_positive_offset
-tsl_only_offset = self.config.get('trailing_only_offset_is_reached', False)

 # Make sure current_profit is calculated using high for backtesting.
 high_profit = current_profit if not high else trade.calc_profit_percent(high)

 # Don't update stoploss if trailing_only_offset_is_reached is true.
-if not (tsl_only_offset and high_profit < sl_offset):
+if not (self.trailing_only_offset_is_reached and high_profit < sl_offset):
 # Specific handling for trailing_stop_positive
-if 'trailing_stop_positive' in self.config and high_profit > sl_offset:
+if self.trailing_stop_positive is not None and high_profit > sl_offset:
-# Ignore mypy error check in configuration that this is a float
-stop_loss_value = self.config.get('trailing_stop_positive') # type: ignore
+stop_loss_value = self.trailing_stop_positive
 logger.debug(f"{trade.pair} - Using positive stoploss: {stop_loss_value} "
 f"offset: {sl_offset:.4g} profit: {current_profit:.4f}%")
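Note (illustration, not part of this commit): the trailing-stop settings are now read from attributes on the strategy object rather than looked up in `self.config` on every call. A hedged sketch of what declaring them as class attributes could look like; the class, values and helper method are illustrative, only the attribute names come from the hunk above:

```python
class MyTrailingStrategy:
    # Mirrors the IStrategy attributes touched above; defaults assumed from the diff.
    trailing_stop: bool = True
    trailing_stop_positive: float = 0.01          # trail 1% behind the high once in profit
    trailing_stop_positive_offset: float = 0.02   # only start trailing after +2%
    trailing_only_offset_is_reached: bool = True

    def should_trail(self, high_profit: float) -> bool:
        # Same guard as in the hunk: skip trailing until the offset is reached.
        return not (self.trailing_only_offset_is_reached
                    and high_profit < self.trailing_stop_positive_offset)


print(MyTrailingStrategy().should_trail(high_profit=0.015))  # False: offset not reached yet
```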
@@ -12,7 +12,9 @@ from tabulate import tabulate
 from freqtrade import OperationalException
 from freqtrade.configuration import Configuration, TimeRange
 from freqtrade.configuration.directory_operations import create_userdata_dir
-from freqtrade.data.history import refresh_backtest_ohlcv_data
+from freqtrade.data.history import (convert_trades_to_ohlcv,
+refresh_backtest_ohlcv_data,
+refresh_backtest_trades_data)
 from freqtrade.exchange import (available_exchanges, ccxt_exchanges, market_is_active,
 symbol_is_pair)
 from freqtrade.misc import plural

@@ -94,6 +96,16 @@ def start_download_data(args: Dict[str, Any]) -> None:
 # Init exchange
 exchange = ExchangeResolver(config['exchange']['name'], config).exchange

+if config.get('download_trades'):
+pairs_not_available = refresh_backtest_trades_data(
+exchange, pairs=config["pairs"], datadir=Path(config['datadir']),
+timerange=timerange, erase=config.get("erase"))
+
+# Convert downloaded trade data to different timeframes
+convert_trades_to_ohlcv(
+pairs=config["pairs"], timeframes=config["timeframes"],
+datadir=Path(config['datadir']), timerange=timerange, erase=config.get("erase"))
+else:
 pairs_not_available = refresh_backtest_ohlcv_data(
 exchange, pairs=config["pairs"], timeframes=config["timeframes"],
 dl_path=Path(config['datadir']), timerange=timerange, erase=config.get("erase"))
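Note (illustration, not part of this commit): with `download_trades` set, the download step switches from fetching OHLCV candles to fetching raw trades and then resampling them into the requested timeframes. A simplified, self-contained sketch of that branch; the function below only prints what would happen and does not call the freqtrade helpers:

```python
def download_data(config: dict) -> None:
    # Stand-in for the branch in start_download_data: trades first, then resample,
    # otherwise fall back to plain candle downloads.
    if config.get("download_trades"):
        print(f"downloading raw trades for {config['pairs']}")
        print(f"resampling trades into candles for timeframes {config['timeframes']}")
    else:
        print(f"downloading OHLCV candles for {config['pairs']} / {config['timeframes']}")


download_data({"download_trades": True, "pairs": ["ETH/BTC"], "timeframes": ["1m", "5m"]})
```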
@@ -52,3 +52,4 @@ markdown_extensions:
 - pymdownx.tasklist:
 custom_checkbox: true
 - pymdownx.tilde
+- mdx_truly_sane_lists

@@ -1,14 +1,14 @@
 # requirements without requirements installable via conda
 # mainly used for Raspberry pi installs
-ccxt==1.18.1225
+ccxt==1.18.1260
-SQLAlchemy==1.3.9
+SQLAlchemy==1.3.10
 python-telegram-bot==12.1.1
 arrow==0.15.2
 cachetools==3.1.1
 requests==2.22.0
 urllib3==1.25.6
 wrapt==1.11.2
-jsonschema==3.0.2
+jsonschema==3.1.1
 TA-Lib==0.4.17
 tabulate==0.8.5
 coinmarketcap==5.0.3

@@ -1,5 +1,5 @@
 # Include all requirements to run the bot.
-# -r requirements.txt
+-r requirements.txt

 # Required for hyperopt
 scipy==1.3.1

@@ -9,8 +9,8 @@ from pathlib import Path
 from unittest.mock import MagicMock, PropertyMock

 import arrow
-import pytest
 import numpy as np
+import pytest
 from telegram import Chat, Message, Update

 from freqtrade import constants, persistence

@@ -19,10 +19,10 @@ from freqtrade.data.converter import parse_ticker_dataframe
 from freqtrade.edge import Edge, PairInfo
 from freqtrade.exchange import Exchange
 from freqtrade.freqtradebot import FreqtradeBot
+from freqtrade.persistence import Trade
 from freqtrade.resolvers import ExchangeResolver
 from freqtrade.worker import Worker


 logging.getLogger('').setLevel(logging.INFO)

@@ -653,6 +653,14 @@ def limit_buy_order_old_partial():
 }


+@pytest.fixture
+def limit_buy_order_old_partial_canceled(limit_buy_order_old_partial):
+res = deepcopy(limit_buy_order_old_partial)
+res['status'] = 'canceled'
+res['fee'] = {'cost': 0.0001, 'currency': 'ETH'}
+return res
+
+
 @pytest.fixture
 def limit_sell_order():
 return {

@@ -941,12 +949,6 @@ def result(testdatadir):
 return parse_ticker_dataframe(json.load(data_file), '1m', pair="UNITTEST/BTC",
 fill_missing=True)

-# FIX:
-# Create an fixture/function
-# that inserts a trade of some type and open-status
-# return the open-order-id
-# See tests in rpc/main that could use this

 @pytest.fixture(scope="function")
 def trades_for_order():

@@ -973,6 +975,110 @@ def trades_for_order():
 'fee': {'cost': 0.008, 'currency': 'LTC'}}]


+@pytest.fixture(scope="function")
+def trades_history():
+return [{'info': {'a': 126181329,
+'p': '0.01962700',
+'q': '0.04000000',
+'f': 138604155,
+'l': 138604155,
+'T': 1565798399463,
+'m': False,
+'M': True},
+'timestamp': 1565798399463,
+'datetime': '2019-08-14T15:59:59.463Z',
+'symbol': 'ETH/BTC',
+'id': '126181329',
+'order': None,
+'type': None,
+'takerOrMaker': None,
+'side': 'buy',
+'price': 0.019627,
+'amount': 0.04,
+'cost': 0.00078508,
+'fee': None},
+{'info': {'a': 126181330,
+'p': '0.01962700',
+'q': '0.24400000',
+'f': 138604156,
+'l': 138604156,
+'T': 1565798399629,
+'m': False,
+'M': True},
+'timestamp': 1565798399629,
+'datetime': '2019-08-14T15:59:59.629Z',
+'symbol': 'ETH/BTC',
+'id': '126181330',
+'order': None,
+'type': None,
+'takerOrMaker': None,
+'side': 'buy',
+'price': 0.019627,
+'amount': 0.244,
+'cost': 0.004788987999999999,
+'fee': None},
+{'info': {'a': 126181331,
+'p': '0.01962600',
+'q': '0.01100000',
+'f': 138604157,
+'l': 138604157,
+'T': 1565798399752,
+'m': True,
+'M': True},
+'timestamp': 1565798399752,
+'datetime': '2019-08-14T15:59:59.752Z',
+'symbol': 'ETH/BTC',
+'id': '126181331',
+'order': None,
+'type': None,
+'takerOrMaker': None,
+'side': 'sell',
+'price': 0.019626,
+'amount': 0.011,
+'cost': 0.00021588599999999999,
+'fee': None},
+{'info': {'a': 126181332,
+'p': '0.01962600',
+'q': '0.01100000',
+'f': 138604158,
+'l': 138604158,
+'T': 1565798399862,
+'m': True,
+'M': True},
+'timestamp': 1565798399862,
+'datetime': '2019-08-14T15:59:59.862Z',
+'symbol': 'ETH/BTC',
+'id': '126181332',
+'order': None,
+'type': None,
+'takerOrMaker': None,
+'side': 'sell',
+'price': 0.019626,
+'amount': 0.011,
+'cost': 0.00021588599999999999,
+'fee': None},
+{'info': {'a': 126181333,
+'p': '0.01952600',
+'q': '0.01200000',
+'f': 138604158,
+'l': 138604158,
+'T': 1565798399872,
+'m': True,
+'M': True},
+'timestamp': 1565798399872,
+'datetime': '2019-08-14T15:59:59.872Z',
+'symbol': 'ETH/BTC',
+'id': '126181333',
+'order': None,
+'type': None,
+'takerOrMaker': None,
+'side': 'sell',
+'price': 0.019626,
+'amount': 0.011,
+'cost': 0.00021588599999999999,
+'fee': None}]
+
+
 @pytest.fixture(scope="function")
 def trades_for_order2():
 return [{'info': {'id': 34567,

@@ -1120,3 +1226,19 @@ def import_fails() -> None:

 # restore previous importfunction
 builtins.__import__ = realimport
+
+
+@pytest.fixture(scope="function")
+def open_trade():
+return Trade(
+pair='ETH/BTC',
+open_rate=0.00001099,
+exchange='bittrex',
+open_order_id='123456789',
+amount=90.99181073,
+fee_open=0.0,
+fee_close=0.0,
+stake_amount=1,
+open_date=arrow.utcnow().shift(minutes=-601).datetime,
+is_open=True
+)

@@ -53,12 +53,12 @@ def test_load_trades_db(default_conf, fee, mocker):

 def test_extract_trades_of_period(testdatadir):
 pair = "UNITTEST/BTC"
-timerange = TimeRange(None, 'line', 0, -1000)
+# 2018-11-14 06:07:00
+timerange = TimeRange('date', None, 1510639620, 0)

 data = load_pair_history(pair=pair, ticker_interval='1m',
 datadir=testdatadir, timerange=timerange)

-# timerange = 2017-11-14 06:07 - 2017-11-14 22:58:00
 trades = DataFrame(
 {'pair': [pair, pair, pair, pair],
 'profit_percent': [0.0, 0.1, -0.2, -0.5],
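Note (illustration, not part of this commit): throughout the tests the old index/line based timeranges are replaced by explicit dates or POSIX timestamps. A small standard-library helper showing what a `'1510694220-1510700340'` style range denotes; this only illustrates the timestamp form and is not freqtrade's parser:

```python
from datetime import datetime, timezone


def describe_timerange(spec: str) -> str:
    # Interpret 'start-stop' where both parts are POSIX timestamps (UTC).
    start_s, stop_s = spec.split("-")
    fmt = "%Y-%m-%d %H:%M:%S"
    start = datetime.fromtimestamp(int(start_s), tz=timezone.utc).strftime(fmt)
    stop = datetime.fromtimestamp(int(stop_s), tz=timezone.utc).strftime(fmt)
    return f"{start} -> {stop} (UTC)"


print(describe_timerange("1510694220-1510700340"))
# 2017-11-14 21:17:00 -> 2017-11-14 22:59:00 (UTC)
```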
@@ -13,15 +13,20 @@ from pandas import DataFrame
 from freqtrade import OperationalException
 from freqtrade.configuration import TimeRange
 from freqtrade.data import history
-from freqtrade.data.history import (download_pair_history,
-_load_cached_data_for_updating,
-load_tickerdata_file,
+from freqtrade.data.history import (_load_cached_data_for_updating,
+convert_trades_to_ohlcv,
+download_pair_history,
+download_trades_history,
+load_tickerdata_file, pair_data_filename,
+pair_trades_filename,
 refresh_backtest_ohlcv_data,
+refresh_backtest_trades_data,
 trim_tickerlist)
 from freqtrade.exchange import timeframe_to_minutes
 from freqtrade.misc import file_dump_json
 from freqtrade.strategy.default_strategy import DefaultStrategy
-from tests.conftest import get_patched_exchange, log_has, log_has_re, patch_exchange
+from tests.conftest import (get_patched_exchange, log_has, log_has_re,
+patch_exchange)

 # Change this if modifying UNITTEST/BTC testdatafile
 _BTC_UNITTEST_LENGTH = 13681

@@ -134,6 +139,18 @@ def test_testdata_path(testdatadir) -> None:
 assert str(Path('tests') / 'testdata') in str(testdatadir)


+def test_pair_data_filename():
+fn = pair_data_filename(Path('freqtrade/hello/world'), 'ETH/BTC', '5m')
+assert isinstance(fn, Path)
+assert fn == Path('freqtrade/hello/world/ETH_BTC-5m.json')
+
+
+def test_pair_trades_filename():
+fn = pair_trades_filename(Path('freqtrade/hello/world'), 'ETH/BTC')
+assert isinstance(fn, Path)
+assert fn == Path('freqtrade/hello/world/ETH_BTC-trades.json.gz')
+
+
 def test_load_cached_data_for_updating(mocker) -> None:
 datadir = Path(__file__).parent.parent.joinpath('testdata')

@@ -364,37 +381,6 @@ def test_trim_tickerlist(testdatadir) -> None:
 ticker_list = json.load(data_file)
 ticker_list_len = len(ticker_list)

-# Test the pattern ^(-\d+)$
-# This pattern uses the latest N elements
-timerange = TimeRange(None, 'line', 0, -5)
-ticker = trim_tickerlist(ticker_list, timerange)
-ticker_len = len(ticker)
-
-assert ticker_len == 5
-assert ticker_list[0] is not ticker[0] # The first element should be different
-assert ticker_list[-1] is ticker[-1] # The last element must be the same
-
-# Test the pattern ^(\d+)-$
-# This pattern keep X element from the end
-timerange = TimeRange('line', None, 5, 0)
-ticker = trim_tickerlist(ticker_list, timerange)
-ticker_len = len(ticker)
-
-assert ticker_len == 5
-assert ticker_list[0] is ticker[0] # The first element must be the same
-assert ticker_list[-1] is not ticker[-1] # The last element should be different
-
-# Test the pattern ^(\d+)-(\d+)$
-# This pattern extract a window
-timerange = TimeRange('index', 'index', 5, 10)
-ticker = trim_tickerlist(ticker_list, timerange)
-ticker_len = len(ticker)
-
-assert ticker_len == 5
-assert ticker_list[0] is not ticker[0] # The first element should be different
-assert ticker_list[5] is ticker[0] # The list starts at the index 5
-assert ticker_list[9] is ticker[-1] # The list ends at the index 9 (5 elements)
-
 # Test the pattern ^(\d{8})-(\d{8})$
 # This pattern extract a window between the dates
 timerange = TimeRange('date', 'date', ticker_list[5][0] / 1000, ticker_list[10][0] / 1000 - 1)

@@ -434,13 +420,6 @@ def test_trim_tickerlist(testdatadir) -> None:

 assert ticker_list_len == ticker_len

-# Test invalid timerange (start after stop)
-timerange = TimeRange('index', 'index', 10, 5)
-with pytest.raises(ValueError, match=r'The timerange .* is incorrect'):
-trim_tickerlist(ticker_list, timerange)
-
-assert ticker_list_len == ticker_len
-
 # passing empty list
 timerange = TimeRange(None, None, None, 5)
 ticker = trim_tickerlist([], timerange)

@@ -569,3 +548,92 @@ def test_download_data_no_markets(mocker, default_conf, caplog, testdatadir):
 assert "ETH/BTC" in unav_pairs
 assert "XRP/BTC" in unav_pairs
 assert log_has("Skipping pair ETH/BTC...", caplog)
+
+
+def test_refresh_backtest_trades_data(mocker, default_conf, markets, caplog, testdatadir):
+dl_mock = mocker.patch('freqtrade.data.history.download_trades_history', MagicMock())
+mocker.patch(
+'freqtrade.exchange.Exchange.markets', PropertyMock(return_value=markets)
+)
+mocker.patch.object(Path, "exists", MagicMock(return_value=True))
+mocker.patch.object(Path, "unlink", MagicMock())
+
+ex = get_patched_exchange(mocker, default_conf)
+timerange = TimeRange.parse_timerange("20190101-20190102")
+unavailable_pairs = refresh_backtest_trades_data(exchange=ex,
+pairs=["ETH/BTC", "XRP/BTC", "XRP/ETH"],
+datadir=testdatadir,
+timerange=timerange, erase=True
+)
+
+assert dl_mock.call_count == 2
+assert dl_mock.call_args[1]['timerange'].starttype == 'date'
+
+assert log_has("Downloading trades for pair ETH/BTC.", caplog)
+assert unavailable_pairs == ["XRP/ETH"]
+assert log_has("Skipping pair XRP/ETH...", caplog)
+
+
+def test_download_trades_history(trades_history, mocker, default_conf, testdatadir, caplog) -> None:
+
+ght_mock = MagicMock(side_effect=lambda pair, *args, **kwargs: (pair, trades_history))
+mocker.patch('freqtrade.exchange.Exchange.get_historic_trades',
+ght_mock)
+exchange = get_patched_exchange(mocker, default_conf)
+file1 = testdatadir / 'ETH_BTC-trades.json.gz'
+
+_backup_file(file1)
+
+assert not file1.is_file()
+
+assert download_trades_history(datadir=testdatadir, exchange=exchange,
+pair='ETH/BTC')
+assert log_has("New Amount of trades: 5", caplog)
+assert file1.is_file()
+
+# clean files freshly downloaded
+_clean_test_file(file1)
+
+mocker.patch('freqtrade.exchange.Exchange.get_historic_trades',
+MagicMock(side_effect=ValueError))
+
+assert not download_trades_history(datadir=testdatadir, exchange=exchange,
+pair='ETH/BTC')
+assert log_has_re('Failed to download historic trades for pair: "ETH/BTC".*', caplog)
+
+
+def test_convert_trades_to_ohlcv(mocker, default_conf, testdatadir, caplog):
+
+pair = 'XRP/ETH'
+file1 = testdatadir / 'XRP_ETH-1m.json'
+file5 = testdatadir / 'XRP_ETH-5m.json'
+# Compare downloaded dataset with converted dataset
+dfbak_1m = history.load_pair_history(datadir=testdatadir,
+ticker_interval="1m",
+pair=pair)
+dfbak_5m = history.load_pair_history(datadir=testdatadir,
+ticker_interval="5m",
+pair=pair)
+
+_backup_file(file1, copy_file=True)
+_backup_file(file5)
+
+tr = TimeRange.parse_timerange('20191011-20191012')
+
+convert_trades_to_ohlcv([pair], timeframes=['1m', '5m'],
+datadir=testdatadir, timerange=tr, erase=True)
+
+assert log_has("Deleting existing data for pair XRP/ETH, interval 1m.", caplog)
+# Load new data
+df_1m = history.load_pair_history(datadir=testdatadir,
+ticker_interval="1m",
+pair=pair)
+df_5m = history.load_pair_history(datadir=testdatadir,
+ticker_interval="5m",
+pair=pair)
+
+assert df_1m.equals(dfbak_1m)
+assert df_5m.equals(dfbak_5m)
+
+_clean_test_file(file1)
+_clean_test_file(file5)

@@ -144,6 +144,12 @@ def test_exchange_resolver(default_conf, mocker, caplog):
 assert not log_has_re(r"No .* specific subclass found. Using the generic class instead.",
 caplog)

+# Test mapping
+exchange = ExchangeResolver('binanceus', default_conf).exchange
+assert isinstance(exchange, Exchange)
+assert isinstance(exchange, Binance)
+assert not isinstance(exchange, Kraken)
+

 def test_validate_order_time_in_force(default_conf, mocker, caplog):
 caplog.set_level(logging.INFO)

@@ -1138,6 +1144,13 @@ async def test__async_get_candle_history(default_conf, mocker, caplog, exchange_
 await exchange._async_get_candle_history(pair, "5m",
 (arrow.utcnow().timestamp - 2000) * 1000)

+with pytest.raises(OperationalException, match=r'Exchange.* does not support fetching '
+r'historical candlestick data\..*'):
+api_mock.fetch_ohlcv = MagicMock(side_effect=ccxt.NotSupported("Not supported"))
+exchange = get_patched_exchange(mocker, default_conf, api_mock, id=exchange_name)
+await exchange._async_get_candle_history(pair, "5m",
+(arrow.utcnow().timestamp - 2000) * 1000)
+

 @pytest.mark.asyncio
 async def test__async_get_candle_history_empty(default_conf, mocker, caplog):

@@ -1309,6 +1322,196 @@ async def test___async_get_candle_history_sort(default_conf, mocker, exchange_na
 assert ticks[9][5] == 2.31452783


+@pytest.mark.asyncio
+@pytest.mark.parametrize("exchange_name", EXCHANGES)
+async def test__async_fetch_trades(default_conf, mocker, caplog, exchange_name,
+trades_history):
+
+caplog.set_level(logging.DEBUG)
+exchange = get_patched_exchange(mocker, default_conf, id=exchange_name)
+# Monkey-patch async function
+exchange._api_async.fetch_trades = get_mock_coro(trades_history)
+
+pair = 'ETH/BTC'
+res = await exchange._async_fetch_trades(pair, since=None, params=None)
+assert type(res) is list
+assert isinstance(res[0], dict)
+assert isinstance(res[1], dict)
+
+assert exchange._api_async.fetch_trades.call_count == 1
+assert exchange._api_async.fetch_trades.call_args[0][0] == pair
+assert exchange._api_async.fetch_trades.call_args[1]['limit'] == 1000
+
+assert log_has_re(f"Fetching trades for pair {pair}, since .*", caplog)
+caplog.clear()
+exchange._api_async.fetch_trades.reset_mock()
+res = await exchange._async_fetch_trades(pair, since=None, params={'from': '123'})
+assert exchange._api_async.fetch_trades.call_count == 1
+assert exchange._api_async.fetch_trades.call_args[0][0] == pair
+assert exchange._api_async.fetch_trades.call_args[1]['limit'] == 1000
+assert exchange._api_async.fetch_trades.call_args[1]['params'] == {'from': '123'}
+assert log_has_re(f"Fetching trades for pair {pair}, params: .*", caplog)
+
+exchange = Exchange(default_conf)
+await async_ccxt_exception(mocker, default_conf, MagicMock(),
+"_async_fetch_trades", "fetch_trades",
+pair='ABCD/BTC', since=None)
+
+api_mock = MagicMock()
+with pytest.raises(OperationalException, match=r'Could not fetch trade data*'):
+api_mock.fetch_trades = MagicMock(side_effect=ccxt.BaseError("Unknown error"))
+exchange = get_patched_exchange(mocker, default_conf, api_mock, id=exchange_name)
+await exchange._async_fetch_trades(pair, since=(arrow.utcnow().timestamp - 2000) * 1000)
+
+with pytest.raises(OperationalException, match=r'Exchange.* does not support fetching '
+r'historical trade data\..*'):
+api_mock.fetch_trades = MagicMock(side_effect=ccxt.NotSupported("Not supported"))
+exchange = get_patched_exchange(mocker, default_conf, api_mock, id=exchange_name)
+await exchange._async_fetch_trades(pair, since=(arrow.utcnow().timestamp - 2000) * 1000)
+
+
+@pytest.mark.asyncio
+@pytest.mark.parametrize("exchange_name", EXCHANGES)
+async def test__async_get_trade_history_id(default_conf, mocker, caplog, exchange_name,
+trades_history):
+
+exchange = get_patched_exchange(mocker, default_conf, id=exchange_name)
+pagination_arg = exchange._trades_pagination_arg
+
+async def mock_get_trade_hist(pair, *args, **kwargs):
+if 'since' in kwargs:
+# Return first 3
+return trades_history[:-2]
+elif kwargs.get('params', {}).get(pagination_arg) == trades_history[-3]['id']:
+# Return 2
+return trades_history[-3:-1]
+else:
+# Return last 2
+return trades_history[-2:]
+# Monkey-patch async function
+exchange._async_fetch_trades = MagicMock(side_effect=mock_get_trade_hist)
+
+pair = 'ETH/BTC'
+ret = await exchange._async_get_trade_history_id(pair, since=trades_history[0]["timestamp"],
+until=trades_history[-1]["timestamp"]-1)
+assert type(ret) is tuple
+assert ret[0] == pair
+assert type(ret[1]) is list
+assert len(ret[1]) == len(trades_history)
+assert exchange._async_fetch_trades.call_count == 3
+fetch_trades_cal = exchange._async_fetch_trades.call_args_list
+# first call (using since, not fromId)
+assert fetch_trades_cal[0][0][0] == pair
+assert fetch_trades_cal[0][1]['since'] == trades_history[0]["timestamp"]
+
+# 2nd call
+assert fetch_trades_cal[1][0][0] == pair
+assert 'params' in fetch_trades_cal[1][1]
+assert exchange._ft_has['trades_pagination_arg'] in fetch_trades_cal[1][1]['params']
+
+
+@pytest.mark.asyncio
+@pytest.mark.parametrize("exchange_name", EXCHANGES)
+async def test__async_get_trade_history_time(default_conf, mocker, caplog, exchange_name,
+trades_history):
+
+caplog.set_level(logging.DEBUG)
+
+async def mock_get_trade_hist(pair, *args, **kwargs):
+if kwargs['since'] == trades_history[0]["timestamp"]:
+return trades_history[:-1]
+else:
+return trades_history[-1:]
+
+caplog.set_level(logging.DEBUG)
+exchange = get_patched_exchange(mocker, default_conf, id=exchange_name)
+# Monkey-patch async function
+exchange._async_fetch_trades = MagicMock(side_effect=mock_get_trade_hist)
+pair = 'ETH/BTC'
+ret = await exchange._async_get_trade_history_time(pair, since=trades_history[0]["timestamp"],
+until=trades_history[-1]["timestamp"]-1)
+assert type(ret) is tuple
+assert ret[0] == pair
+assert type(ret[1]) is list
+assert len(ret[1]) == len(trades_history)
+assert exchange._async_fetch_trades.call_count == 2
+fetch_trades_cal = exchange._async_fetch_trades.call_args_list
+# first call (using since, not fromId)
+assert fetch_trades_cal[0][0][0] == pair
+assert fetch_trades_cal[0][1]['since'] == trades_history[0]["timestamp"]
+
+# 2nd call
+assert fetch_trades_cal[1][0][0] == pair
+assert fetch_trades_cal[0][1]['since'] == trades_history[0]["timestamp"]
+assert log_has_re(r"Stopping because until was reached.*", caplog)
+
+
+@pytest.mark.asyncio
+@pytest.mark.parametrize("exchange_name", EXCHANGES)
+async def test__async_get_trade_history_time_empty(default_conf, mocker, caplog, exchange_name,
+trades_history):
+
+caplog.set_level(logging.DEBUG)
+
+async def mock_get_trade_hist(pair, *args, **kwargs):
+if kwargs['since'] == trades_history[0]["timestamp"]:
+return trades_history[:-1]
+else:
+return []
+
+caplog.set_level(logging.DEBUG)
+exchange = get_patched_exchange(mocker, default_conf, id=exchange_name)
+# Monkey-patch async function
+exchange._async_fetch_trades = MagicMock(side_effect=mock_get_trade_hist)
+pair = 'ETH/BTC'
+ret = await exchange._async_get_trade_history_time(pair, since=trades_history[0]["timestamp"],
+until=trades_history[-1]["timestamp"]-1)
+assert type(ret) is tuple
+assert ret[0] == pair
+assert type(ret[1]) is list
+assert len(ret[1]) == len(trades_history) - 1
+assert exchange._async_fetch_trades.call_count == 2
+fetch_trades_cal = exchange._async_fetch_trades.call_args_list
+# first call (using since, not fromId)
+assert fetch_trades_cal[0][0][0] == pair
+assert fetch_trades_cal[0][1]['since'] == trades_history[0]["timestamp"]
+
+
+@pytest.mark.parametrize("exchange_name", EXCHANGES)
+def test_get_historic_trades(default_conf, mocker, caplog, exchange_name, trades_history):
+mocker.patch('freqtrade.exchange.Exchange.exchange_has', return_value=True)
+exchange = get_patched_exchange(mocker, default_conf, id=exchange_name)
+
+pair = 'ETH/BTC'
+
+exchange._async_get_trade_history_id = get_mock_coro((pair, trades_history))
+exchange._async_get_trade_history_time = get_mock_coro((pair, trades_history))
+ret = exchange.get_historic_trades(pair, since=trades_history[0]["timestamp"],
+until=trades_history[-1]["timestamp"])
+
+# Depending on the exchange, one or the other method should be called
+assert sum([exchange._async_get_trade_history_id.call_count,
+exchange._async_get_trade_history_time.call_count]) == 1
+
+assert len(ret) == 2
+assert ret[0] == pair
+assert len(ret[1]) == len(trades_history)
+
+
+@pytest.mark.parametrize("exchange_name", EXCHANGES)
+def test_get_historic_trades_notsupported(default_conf, mocker, caplog, exchange_name,
+trades_history):
+mocker.patch('freqtrade.exchange.Exchange.exchange_has', return_value=False)
+exchange = get_patched_exchange(mocker, default_conf, id=exchange_name)
+
+pair = 'ETH/BTC'
+
+with pytest.raises(OperationalException,
+match="This exchange does not suport downloading Trades."):
+exchange.get_historic_trades(pair, since=trades_history[0]["timestamp"],
+until=trades_history[-1]["timestamp"])
+
+
 @pytest.mark.parametrize("exchange_name", EXCHANGES)
 def test_cancel_order_dry_run(default_conf, mocker, exchange_name):
 default_conf['dry_run'] = True

@@ -1455,13 +1658,17 @@ def test_merge_ft_has_dict(default_conf, mocker):
 assert ex._ft_has == Exchange._ft_has_default

 ex = Kraken(default_conf)
-assert ex._ft_has == Exchange._ft_has_default
+assert ex._ft_has != Exchange._ft_has_default
+assert ex._ft_has['trades_pagination'] == 'id'
+assert ex._ft_has['trades_pagination_arg'] == 'since'

 # Binance defines different values
 ex = Binance(default_conf)
 assert ex._ft_has != Exchange._ft_has_default
 assert ex._ft_has['stoploss_on_exchange']
 assert ex._ft_has['order_time_in_force'] == ['gtc', 'fok', 'ioc']
+assert ex._ft_has['trades_pagination'] == 'id'
+assert ex._ft_has['trades_pagination_arg'] == 'fromId'

 conf = copy.deepcopy(default_conf)
 conf['exchange']['_ft_has_params'] = {"DeadBeef": 20,
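Note (illustration, not part of this commit): the new `trades_pagination` / `trades_pagination_arg` entries in `_ft_has` let the generic exchange class decide how trade history is paged for each exchange. A dictionary-driven sketch of that dispatch; the Binance and Kraken values mirror the assertions above, while the "default" entry and the function name are assumptions, not taken from the source:

```python
EXCHANGE_FT_HAS = {
    "binance": {"trades_pagination": "id", "trades_pagination_arg": "fromId"},
    "kraken": {"trades_pagination": "id", "trades_pagination_arg": "since"},
    # Assumed fallback for exchanges that page by timestamp.
    "default": {"trades_pagination": "time", "trades_pagination_arg": "since"},
}


def describe_trade_paging(exchange_name: str, pair: str) -> str:
    ft_has = EXCHANGE_FT_HAS.get(exchange_name, EXCHANGE_FT_HAS["default"])
    if ft_has["trades_pagination"] == "id":
        return f"paginate {pair} trades by trade id via '{ft_has['trades_pagination_arg']}'"
    return f"paginate {pair} trades by timestamp via '{ft_has['trades_pagination_arg']}'"


print(describe_trade_paging("binance", "ETH/BTC"))
print(describe_trade_paging("someother", "ETH/BTC"))
```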
@ -49,7 +49,7 @@ def trim_dictlist(dict_list, num):
|
||||||
|
|
||||||
|
|
||||||
def load_data_test(what, testdatadir):
|
def load_data_test(what, testdatadir):
|
||||||
timerange = TimeRange(None, 'line', 0, -101)
|
timerange = TimeRange.parse_timerange('1510694220-1510700340')
|
||||||
pair = history.load_tickerdata_file(testdatadir, ticker_interval='1m',
|
pair = history.load_tickerdata_file(testdatadir, ticker_interval='1m',
|
||||||
pair='UNITTEST/BTC', timerange=timerange)
|
pair='UNITTEST/BTC', timerange=timerange)
|
||||||
datalen = len(pair)
|
datalen = len(pair)
|
||||||
|
@ -342,7 +342,8 @@ def test_tickerdata_with_fee(default_conf, mocker, testdatadir) -> None:
|
||||||
|
|
||||||
def test_tickerdata_to_dataframe_bt(default_conf, mocker, testdatadir) -> None:
|
def test_tickerdata_to_dataframe_bt(default_conf, mocker, testdatadir) -> None:
|
||||||
patch_exchange(mocker)
|
patch_exchange(mocker)
|
||||||
timerange = TimeRange(None, 'line', 0, -100)
|
# timerange = TimeRange(None, 'line', 0, -100)
|
||||||
|
timerange = TimeRange.parse_timerange('1510694220-1510700340')
|
||||||
tick = history.load_tickerdata_file(testdatadir, 'UNITTEST/BTC', '1m', timerange=timerange)
|
tick = history.load_tickerdata_file(testdatadir, 'UNITTEST/BTC', '1m', timerange=timerange)
|
||||||
tickerlist = {'UNITTEST/BTC': parse_ticker_dataframe(tick, '1m', pair="UNITTEST/BTC",
|
tickerlist = {'UNITTEST/BTC': parse_ticker_dataframe(tick, '1m', pair="UNITTEST/BTC",
|
||||||
fill_missing=True)}
|
fill_missing=True)}
|
||||||
|
@ -474,7 +475,7 @@ def test_backtesting_start(default_conf, mocker, testdatadir, caplog) -> None:
|
||||||
default_conf['ticker_interval'] = '1m'
|
default_conf['ticker_interval'] = '1m'
|
||||||
default_conf['datadir'] = testdatadir
|
default_conf['datadir'] = testdatadir
|
||||||
default_conf['export'] = None
|
default_conf['export'] = None
|
||||||
default_conf['timerange'] = '-100'
|
default_conf['timerange'] = '-1510694220'
|
||||||
|
|
||||||
backtesting = Backtesting(default_conf)
|
backtesting = Backtesting(default_conf)
|
||||||
backtesting.start()
|
backtesting.start()
|
||||||
|
@ -522,7 +523,7 @@ def test_backtest(default_conf, fee, mocker, testdatadir) -> None:
|
||||||
patch_exchange(mocker)
|
patch_exchange(mocker)
|
||||||
backtesting = Backtesting(default_conf)
|
backtesting = Backtesting(default_conf)
|
||||||
pair = 'UNITTEST/BTC'
|
pair = 'UNITTEST/BTC'
|
||||||
timerange = TimeRange(None, 'line', 0, -201)
|
timerange = TimeRange('date', None, 1517227800, 0)
|
||||||
data = history.load_data(datadir=testdatadir, ticker_interval='5m', pairs=['UNITTEST/BTC'],
|
data = history.load_data(datadir=testdatadir, ticker_interval='5m', pairs=['UNITTEST/BTC'],
|
||||||
timerange=timerange)
|
timerange=timerange)
|
||||||
data_processed = backtesting.strategy.tickerdata_to_dataframe(data)
|
data_processed = backtesting.strategy.tickerdata_to_dataframe(data)
|
||||||
|
@ -578,7 +579,7 @@ def test_backtest_1min_ticker_interval(default_conf, fee, mocker, testdatadir) -
|
||||||
backtesting = Backtesting(default_conf)
|
backtesting = Backtesting(default_conf)
|
||||||
|
|
||||||
# Run a backtesting for an exiting 1min ticker_interval
|
# Run a backtesting for an exiting 1min ticker_interval
|
||||||
timerange = TimeRange(None, 'line', 0, -200)
|
timerange = TimeRange.parse_timerange('1510688220-1510700340')
|
||||||
data = history.load_data(datadir=testdatadir, ticker_interval='1m', pairs=['UNITTEST/BTC'],
|
data = history.load_data(datadir=testdatadir, ticker_interval='1m', pairs=['UNITTEST/BTC'],
|
||||||
timerange=timerange)
|
timerange=timerange)
|
||||||
processed = backtesting.strategy.tickerdata_to_dataframe(data)
|
processed = backtesting.strategy.tickerdata_to_dataframe(data)
|
||||||
|
@ -823,7 +824,7 @@ def test_backtest_start_timerange(default_conf, mocker, caplog, testdatadir):
|
||||||
'--datadir', str(testdatadir),
|
'--datadir', str(testdatadir),
|
||||||
'backtesting',
|
'backtesting',
|
||||||
'--ticker-interval', '1m',
|
'--ticker-interval', '1m',
|
||||||
'--timerange', '-100',
|
'--timerange', '1510694220-1510700340',
|
||||||
'--enable-position-stacking',
|
'--enable-position-stacking',
|
||||||
'--disable-max-market-positions'
|
'--disable-max-market-positions'
|
||||||
]
|
]
|
||||||
|
@ -833,7 +834,7 @@ def test_backtest_start_timerange(default_conf, mocker, caplog, testdatadir):
|
||||||
exists = [
|
exists = [
|
||||||
'Parameter -i/--ticker-interval detected ... Using ticker_interval: 1m ...',
|
'Parameter -i/--ticker-interval detected ... Using ticker_interval: 1m ...',
|
||||||
'Ignoring max_open_trades (--disable-max-market-positions was used) ...',
|
'Ignoring max_open_trades (--disable-max-market-positions was used) ...',
|
||||||
'Parameter --timerange detected: -100 ...',
|
'Parameter --timerange detected: 1510694220-1510700340 ...',
|
||||||
f'Using data directory: {testdatadir} ...',
|
f'Using data directory: {testdatadir} ...',
|
||||||
'Using stake_currency: BTC ...',
|
'Using stake_currency: BTC ...',
|
||||||
'Using stake_amount: 0.001 ...',
|
'Using stake_amount: 0.001 ...',
|
||||||
|
@ -869,7 +870,7 @@ def test_backtest_start_multi_strat(default_conf, mocker, caplog, testdatadir):
|
||||||
'--datadir', str(testdatadir),
|
'--datadir', str(testdatadir),
|
||||||
'backtesting',
|
'backtesting',
|
||||||
'--ticker-interval', '1m',
|
'--ticker-interval', '1m',
|
||||||
'--timerange', '-100',
|
'--timerange', '1510694220-1510700340',
|
||||||
'--enable-position-stacking',
|
'--enable-position-stacking',
|
||||||
'--disable-max-market-positions',
|
'--disable-max-market-positions',
|
||||||
'--strategy-list',
|
'--strategy-list',
|
||||||
|
@ -887,7 +888,7 @@ def test_backtest_start_multi_strat(default_conf, mocker, caplog, testdatadir):
|
||||||
exists = [
|
exists = [
|
||||||
'Parameter -i/--ticker-interval detected ... Using ticker_interval: 1m ...',
|
'Parameter -i/--ticker-interval detected ... Using ticker_interval: 1m ...',
|
||||||
'Ignoring max_open_trades (--disable-max-market-positions was used) ...',
|
'Ignoring max_open_trades (--disable-max-market-positions was used) ...',
|
||||||
'Parameter --timerange detected: -100 ...',
|
'Parameter --timerange detected: 1510694220-1510700340 ...',
|
||||||
f'Using data directory: {testdatadir} ...',
|
f'Using data directory: {testdatadir} ...',
|
||||||
'Using stake_currency: BTC ...',
|
'Using stake_currency: BTC ...',
|
||||||
'Using stake_amount: 0.001 ...',
|
'Using stake_amount: 0.001 ...',
|
||||||
|
|
|
@ -12,7 +12,7 @@ from freqtrade import OperationalException
|
||||||
from freqtrade.data.converter import parse_ticker_dataframe
|
from freqtrade.data.converter import parse_ticker_dataframe
|
||||||
from freqtrade.data.history import load_tickerdata_file
|
from freqtrade.data.history import load_tickerdata_file
|
||||||
from freqtrade.optimize import setup_configuration, start_hyperopt
|
from freqtrade.optimize import setup_configuration, start_hyperopt
|
||||||
from freqtrade.optimize.default_hyperopt import DefaultHyperOpts
|
from freqtrade.optimize.default_hyperopt import DefaultHyperOpt
|
||||||
from freqtrade.optimize.default_hyperopt_loss import DefaultHyperOptLoss
|
from freqtrade.optimize.default_hyperopt_loss import DefaultHyperOptLoss
|
||||||
from freqtrade.optimize.hyperopt import Hyperopt
|
from freqtrade.optimize.hyperopt import Hyperopt
|
||||||
from freqtrade.resolvers.hyperopt_resolver import (HyperOptLossResolver,
|
from freqtrade.resolvers.hyperopt_resolver import (HyperOptLossResolver,
|
||||||
|
@ -148,12 +148,12 @@ def test_setup_hyperopt_configuration_with_arguments(mocker, default_conf, caplo
|
||||||
def test_hyperoptresolver(mocker, default_conf, caplog) -> None:
|
def test_hyperoptresolver(mocker, default_conf, caplog) -> None:
|
||||||
patched_configuration_load_config_file(mocker, default_conf)
|
patched_configuration_load_config_file(mocker, default_conf)
|
||||||
|
|
||||||
hyperopts = DefaultHyperOpts
|
hyperopt = DefaultHyperOpt
|
||||||
delattr(hyperopts, 'populate_buy_trend')
|
delattr(hyperopt, 'populate_buy_trend')
|
||||||
delattr(hyperopts, 'populate_sell_trend')
|
delattr(hyperopt, 'populate_sell_trend')
|
||||||
mocker.patch(
|
mocker.patch(
|
||||||
'freqtrade.resolvers.hyperopt_resolver.HyperOptResolver._load_hyperopt',
|
'freqtrade.resolvers.hyperopt_resolver.HyperOptResolver._load_hyperopt',
|
||||||
MagicMock(return_value=hyperopts(default_conf))
|
MagicMock(return_value=hyperopt(default_conf))
|
||||||
)
|
)
|
||||||
x = HyperOptResolver(default_conf, ).hyperopt
|
x = HyperOptResolver(default_conf, ).hyperopt
|
||||||
assert not hasattr(x, 'populate_buy_trend')
|
assert not hasattr(x, 'populate_buy_trend')
|
||||||
|
|
|
@ -106,7 +106,7 @@ def test_get_signal_handles_exceptions(mocker, default_conf):
|
||||||
def test_tickerdata_to_dataframe(default_conf, testdatadir) -> None:
|
def test_tickerdata_to_dataframe(default_conf, testdatadir) -> None:
|
||||||
strategy = DefaultStrategy(default_conf)
|
strategy = DefaultStrategy(default_conf)
|
||||||
|
|
||||||
timerange = TimeRange(None, 'line', 0, -100)
|
timerange = TimeRange.parse_timerange('1510694220-1510700340')
|
||||||
tick = load_tickerdata_file(testdatadir, 'UNITTEST/BTC', '1m', timerange=timerange)
|
tick = load_tickerdata_file(testdatadir, 'UNITTEST/BTC', '1m', timerange=timerange)
|
||||||
tickerlist = {'UNITTEST/BTC': parse_ticker_dataframe(tick, '1m', pair="UNITTEST/BTC",
|
tickerlist = {'UNITTEST/BTC': parse_ticker_dataframe(tick, '1m', pair="UNITTEST/BTC",
|
||||||
fill_missing=True)}
|
fill_missing=True)}
|
||||||
|
|
|
@ -1,6 +1,5 @@
|
||||||
# pragma pylint: disable=missing-docstring, protected-access, C0103
|
# pragma pylint: disable=missing-docstring, protected-access, C0103
|
||||||
import logging
|
import logging
|
||||||
import tempfile
|
|
||||||
import warnings
|
import warnings
|
||||||
from base64 import urlsafe_b64encode
|
from base64 import urlsafe_b64encode
|
||||||
from os import path
|
from os import path
|
||||||
|
@@ -39,7 +38,7 @@ def test_search_strategy():
 def test_load_strategy(default_conf, result):
     default_conf.update({'strategy': 'SampleStrategy'})
     resolver = StrategyResolver(default_conf)
-    assert 'adx' in resolver.strategy.advise_indicators(result, {'pair': 'ETH/BTC'})
+    assert 'rsi' in resolver.strategy.advise_indicators(result, {'pair': 'ETH/BTC'})


 def test_load_strategy_base64(result, caplog, default_conf):

@@ -48,10 +47,10 @@ def test_load_strategy_base64(result, caplog, default_conf):
     default_conf.update({'strategy': 'SampleStrategy:{}'.format(encoded_string)})

     resolver = StrategyResolver(default_conf)
-    assert 'adx' in resolver.strategy.advise_indicators(result, {'pair': 'ETH/BTC'})
+    assert 'rsi' in resolver.strategy.advise_indicators(result, {'pair': 'ETH/BTC'})
     # Make sure strategy was loaded from base64 (using temp directory)!!
     assert log_has_re(r"Using resolved strategy SampleStrategy from '"
-                      + tempfile.gettempdir() + r"/.*/SampleStrategy\.py'\.\.\.", caplog)
+                      r".*(/|\\).*(/|\\)SampleStrategy\.py'\.\.\.", caplog)


 def test_load_strategy_invalid_directory(result, caplog, default_conf):

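For readers unfamiliar with the `SampleStrategy:<base64>` form exercised above: the strategy source is embedded directly in the configuration value and unpacked into a temporary directory at load time. A rough sketch of how such a value could be produced; the file path below is purely illustrative:

    from base64 import urlsafe_b64encode
    from pathlib import Path

    # Illustrative path - point this at the actual strategy file.
    source = Path('user_data/strategies/sample_strategy.py').read_bytes()
    encoded = urlsafe_b64encode(source).decode('utf-8')

    config_update = {'strategy': 'SampleStrategy:{}'.format(encoded)}
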
@@ -399,7 +399,7 @@ def test_setup_configuration_with_arguments(mocker, default_conf, caplog) -> Non
     assert 'pair_whitelist' in config['exchange']
     assert 'datadir' in config
     assert log_has('Using data directory: {} ...'.format("/foo/bar"), caplog)
-    assert log_has('Using user-data directory: {} ...'.format("/tmp/freqtrade"), caplog)
+    assert log_has('Using user-data directory: {} ...'.format(Path("/tmp/freqtrade")), caplog)
     assert 'user_data_dir' in config

     assert 'ticker_interval' in config

@@ -652,9 +652,9 @@ def test_create_userdata_dir(mocker, default_conf, caplog) -> None:
     x = create_userdata_dir('/tmp/bar', create_dir=True)
     assert md.call_count == 7
     assert md.call_args[1]['parents'] is False
-    assert log_has('Created user-data directory: /tmp/bar', caplog)
+    assert log_has(f'Created user-data directory: {Path("/tmp/bar")}', caplog)
     assert isinstance(x, Path)
-    assert str(x) == "/tmp/bar"
+    assert str(x) == str(Path("/tmp/bar"))


 def test_create_userdata_dir_exists(mocker, default_conf, caplog) -> None:

@@ -669,7 +669,8 @@ def test_create_userdata_dir_exists_exception(mocker, default_conf, caplog) -> N
     mocker.patch.object(Path, "is_dir", MagicMock(return_value=False))
     md = mocker.patch.object(Path, 'mkdir', MagicMock())

-    with pytest.raises(OperationalException, match=r'Directory `/tmp/bar` does not exist.*'):
+    with pytest.raises(OperationalException,
+                       match=r'Directory `.{1,2}tmp.{1,2}bar` does not exist.*'):
         create_userdata_dir('/tmp/bar', create_dir=False)
     assert md.call_count == 0

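Several assertions above switch from hard-coded POSIX strings to Path(...) so the tests also pass on Windows, where str(Path(...)) renders with backslashes. A short standard-library illustration of why the plain string comparison was brittle, and why the new regex tolerates either separator:

    from pathlib import PurePosixPath, PureWindowsPath

    # The same logical path renders differently per platform:
    print(str(PurePosixPath('/tmp/bar')))    # /tmp/bar
    print(str(PureWindowsPath('/tmp/bar')))  # \tmp\bar

    # Hence the pattern `.{1,2}tmp.{1,2}bar` above, which accepts a forward slash,
    # a backslash, or an escaped backslash between path components.
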
@@ -1449,7 +1449,7 @@ def test_tsl_on_exchange_compatible_with_edge(mocker, edge_conf, fee, caplog,
     # setting stoploss
     freqtrade.strategy.stoploss = -0.02

-    # setting stoploss_on_exchange_interval to 0 second
+    # setting stoploss_on_exchange_interval to 0 seconds
     freqtrade.strategy.order_types['stoploss_on_exchange_interval'] = 0

     patch_get_signal(freqtrade)

@@ -1678,7 +1678,7 @@ def test_update_trade_state_exception(mocker, default_conf,
     # Test raise of OperationalException exception
     mocker.patch(
         'freqtrade.freqtradebot.FreqtradeBot.get_real_amount',
-        side_effect=OperationalException()
+        side_effect=DependencyException()
     )
     freqtrade.update_trade_state(trade)
     assert log_has('Could not update trade amount: ', caplog)

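The failure type expected from get_real_amount changes from OperationalException to DependencyException. The test only pins the resulting log line; the sketch below shows the general catch-and-log pattern it exercises, under the assumption that the caller logs and swallows the error. This is not freqtrade's actual update_trade_state body, and the exception class here is a local stand-in:

    import logging

    logger = logging.getLogger(__name__)


    class DependencyException(Exception):
        """Local stand-in for freqtrade's DependencyException, used only in this sketch."""


    def update_trade_state_sketch(bot, trade, action_order=None):
        # Recoverable problems while fetching the real filled amount are logged
        # and swallowed instead of aborting the bot loop.
        try:
            trade.amount = bot.get_real_amount(trade, action_order)
        except DependencyException as exception:
            logger.warning("Could not update trade amount: %s", exception)
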
@@ -1916,7 +1916,8 @@ def test_close_trade(default_conf, ticker, limit_buy_order, limit_sell_order,
     freqtrade.handle_trade(trade)


-def test_check_handle_timedout_buy(default_conf, ticker, limit_buy_order_old, fee, mocker) -> None:
+def test_check_handle_timedout_buy(default_conf, ticker, limit_buy_order_old, open_trade,
+                                   fee, mocker) -> None:
     rpc_mock = patch_RPCManager(mocker)
     cancel_order_mock = MagicMock()
     patch_exchange(mocker)

@@ -1929,31 +1930,18 @@ def test_check_handle_timedout_buy(default_conf, ticker, limit_buy_order_old, fe
     )
     freqtrade = FreqtradeBot(default_conf)

-    trade_buy = Trade(
-        pair='ETH/BTC',
-        open_rate=0.00001099,
-        exchange='bittrex',
-        open_order_id='123456789',
-        amount=90.99181073,
-        fee_open=0.0,
-        fee_close=0.0,
-        stake_amount=1,
-        open_date=arrow.utcnow().shift(minutes=-601).datetime,
-        is_open=True
-    )
-
-    Trade.session.add(trade_buy)
+    Trade.session.add(open_trade)

     # check it does cancel buy orders over the time limit
     freqtrade.check_handle_timedout()
     assert cancel_order_mock.call_count == 1
     assert rpc_mock.call_count == 1
-    trades = Trade.query.filter(Trade.open_order_id.is_(trade_buy.open_order_id)).all()
+    trades = Trade.query.filter(Trade.open_order_id.is_(open_trade.open_order_id)).all()
     nb_trades = len(trades)
     assert nb_trades == 0

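The repeated Trade(...) construction removed here (and in the tests that follow) is replaced by a shared open_trade pytest fixture. The fixture definition lives in the test conftest and is not part of this diff; the following is a hypothetical sketch consistent with the literals being removed:

    import arrow
    import pytest

    from freqtrade.persistence import Trade


    @pytest.fixture
    def open_trade():
        # Hypothetical sketch of the shared fixture; the field values simply
        # mirror the literals removed from the tests above.
        return Trade(
            pair='ETH/BTC',
            open_rate=0.00001099,
            exchange='bittrex',
            open_order_id='123456789',
            amount=90.99181073,
            fee_open=0.0,
            fee_close=0.0,
            stake_amount=1,
            open_date=arrow.utcnow().shift(minutes=-601).datetime,
            is_open=True
        )
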
-def test_check_handle_cancelled_buy(default_conf, ticker, limit_buy_order_old,
+def test_check_handle_cancelled_buy(default_conf, ticker, limit_buy_order_old, open_trade,
                                     fee, mocker, caplog) -> None:
     """ Handle Buy order cancelled on exchange"""
     rpc_mock = patch_RPCManager(mocker)

@@ -1969,32 +1957,19 @@ def test_check_handle_cancelled_buy(default_conf, ticker, limit_buy_order_old,
     )
     freqtrade = FreqtradeBot(default_conf)

-    trade_buy = Trade(
-        pair='ETH/BTC',
-        open_rate=0.00001099,
-        exchange='bittrex',
-        open_order_id='123456789',
-        amount=90.99181073,
-        fee_open=0.0,
-        fee_close=0.0,
-        stake_amount=1,
-        open_date=arrow.utcnow().shift(minutes=-601).datetime,
-        is_open=True
-    )
-
-    Trade.session.add(trade_buy)
+    Trade.session.add(open_trade)

     # check it does cancel buy orders over the time limit
     freqtrade.check_handle_timedout()
     assert cancel_order_mock.call_count == 0
     assert rpc_mock.call_count == 1
-    trades = Trade.query.filter(Trade.open_order_id.is_(trade_buy.open_order_id)).all()
+    trades = Trade.query.filter(Trade.open_order_id.is_(open_trade.open_order_id)).all()
     nb_trades = len(trades)
     assert nb_trades == 0
     assert log_has_re("Buy order canceled on Exchange for Trade.*", caplog)


-def test_check_handle_timedout_buy_exception(default_conf, ticker, limit_buy_order_old,
+def test_check_handle_timedout_buy_exception(default_conf, ticker, limit_buy_order_old, open_trade,
                                              fee, mocker) -> None:
     rpc_mock = patch_RPCManager(mocker)
     cancel_order_mock = MagicMock()

@@ -2009,31 +1984,19 @@ def test_check_handle_timedout_buy_exception(default_conf, ticker, limit_buy_ord
     )
     freqtrade = FreqtradeBot(default_conf)

-    trade_buy = Trade(
-        pair='ETH/BTC',
-        open_rate=0.00001099,
-        exchange='bittrex',
-        open_order_id='123456789',
-        amount=90.99181073,
-        fee_open=0.0,
-        fee_close=0.0,
-        stake_amount=1,
-        open_date=arrow.utcnow().shift(minutes=-601).datetime,
-        is_open=True
-    )
-
-    Trade.session.add(trade_buy)
+    Trade.session.add(open_trade)

     # check it does cancel buy orders over the time limit
     freqtrade.check_handle_timedout()
     assert cancel_order_mock.call_count == 0
     assert rpc_mock.call_count == 0
-    trades = Trade.query.filter(Trade.open_order_id.is_(trade_buy.open_order_id)).all()
+    trades = Trade.query.filter(Trade.open_order_id.is_(open_trade.open_order_id)).all()
     nb_trades = len(trades)
     assert nb_trades == 1


-def test_check_handle_timedout_sell(default_conf, ticker, limit_sell_order_old, mocker) -> None:
+def test_check_handle_timedout_sell(default_conf, ticker, limit_sell_order_old, mocker,
+                                    open_trade) -> None:
     rpc_mock = patch_RPCManager(mocker)
     cancel_order_mock = MagicMock()
     patch_exchange(mocker)

@@ -2045,30 +2008,20 @@ def test_check_handle_timedout_sell(default_conf, ticker, limit_sell_order_old,
     )
     freqtrade = FreqtradeBot(default_conf)

-    trade_sell = Trade(
-        pair='ETH/BTC',
-        open_rate=0.00001099,
-        exchange='bittrex',
-        open_order_id='123456789',
-        amount=90.99181073,
-        fee_open=0.0,
-        fee_close=0.0,
-        stake_amount=1,
-        open_date=arrow.utcnow().shift(hours=-5).datetime,
-        close_date=arrow.utcnow().shift(minutes=-601).datetime,
-        is_open=False
-    )
+    open_trade.open_date = arrow.utcnow().shift(hours=-5).datetime
+    open_trade.close_date = arrow.utcnow().shift(minutes=-601).datetime
+    open_trade.is_open = False

-    Trade.session.add(trade_sell)
+    Trade.session.add(open_trade)

     # check it does cancel sell orders over the time limit
     freqtrade.check_handle_timedout()
     assert cancel_order_mock.call_count == 1
     assert rpc_mock.call_count == 1
-    assert trade_sell.is_open is True
+    assert open_trade.is_open is True


-def test_check_handle_cancelled_sell(default_conf, ticker, limit_sell_order_old,
+def test_check_handle_cancelled_sell(default_conf, ticker, limit_sell_order_old, open_trade,
                                      mocker, caplog) -> None:
     """ Handle sell order cancelled on exchange"""
     rpc_mock = patch_RPCManager(mocker)

@@ -2083,34 +2036,24 @@ def test_check_handle_cancelled_sell(default_conf, ticker, limit_sell_order_old,
     )
     freqtrade = FreqtradeBot(default_conf)

-    trade_sell = Trade(
-        pair='ETH/BTC',
-        open_rate=0.00001099,
-        exchange='bittrex',
-        open_order_id='123456789',
-        amount=90.99181073,
-        fee_open=0.0,
-        fee_close=0.0,
-        stake_amount=1,
-        open_date=arrow.utcnow().shift(hours=-5).datetime,
-        close_date=arrow.utcnow().shift(minutes=-601).datetime,
-        is_open=False
-    )
+    open_trade.open_date = arrow.utcnow().shift(hours=-5).datetime
+    open_trade.close_date = arrow.utcnow().shift(minutes=-601).datetime
+    open_trade.is_open = False

-    Trade.session.add(trade_sell)
+    Trade.session.add(open_trade)

     # check it does cancel sell orders over the time limit
     freqtrade.check_handle_timedout()
     assert cancel_order_mock.call_count == 0
     assert rpc_mock.call_count == 1
-    assert trade_sell.is_open is True
+    assert open_trade.is_open is True
     assert log_has_re("Sell order canceled on exchange for Trade.*", caplog)


 def test_check_handle_timedout_partial(default_conf, ticker, limit_buy_order_old_partial,
-                                       mocker) -> None:
+                                       open_trade, mocker) -> None:
     rpc_mock = patch_RPCManager(mocker)
-    cancel_order_mock = MagicMock()
+    cancel_order_mock = MagicMock(return_value=limit_buy_order_old_partial)
     patch_exchange(mocker)
     mocker.patch.multiple(
         'freqtrade.exchange.Exchange',

@@ -2120,33 +2063,97 @@ def test_check_handle_timedout_partial(default_conf, ticker, limit_buy_order_old
     )
     freqtrade = FreqtradeBot(default_conf)

-    trade_buy = Trade(
-        pair='ETH/BTC',
-        open_rate=0.00001099,
-        exchange='bittrex',
-        open_order_id='123456789',
-        amount=90.99181073,
-        fee_open=0.0,
-        fee_close=0.0,
-        stake_amount=1,
-        open_date=arrow.utcnow().shift(minutes=-601).datetime,
-        is_open=True
-    )
-
-    Trade.session.add(trade_buy)
+    Trade.session.add(open_trade)

     # check it does cancel buy orders over the time limit
     # note this is for a partially-complete buy order
     freqtrade.check_handle_timedout()
     assert cancel_order_mock.call_count == 1
     assert rpc_mock.call_count == 1
-    trades = Trade.query.filter(Trade.open_order_id.is_(trade_buy.open_order_id)).all()
+    trades = Trade.query.filter(Trade.open_order_id.is_(open_trade.open_order_id)).all()
     assert len(trades) == 1
     assert trades[0].amount == 23.0
-    assert trades[0].stake_amount == trade_buy.open_rate * trades[0].amount
+    assert trades[0].stake_amount == open_trade.open_rate * trades[0].amount


-def test_check_handle_timedout_exception(default_conf, ticker, mocker, caplog) -> None:
+def test_check_handle_timedout_partial_fee(default_conf, ticker, open_trade, caplog, fee,
+                                           limit_buy_order_old_partial, trades_for_order,
+                                           limit_buy_order_old_partial_canceled, mocker) -> None:
+    rpc_mock = patch_RPCManager(mocker)
+    cancel_order_mock = MagicMock(return_value=limit_buy_order_old_partial_canceled)
+    patch_exchange(mocker)
+    mocker.patch.multiple(
+        'freqtrade.exchange.Exchange',
+        get_ticker=ticker,
+        get_order=MagicMock(return_value=limit_buy_order_old_partial),
+        cancel_order=cancel_order_mock,
+        get_trades_for_order=MagicMock(return_value=trades_for_order),
+    )
+    freqtrade = FreqtradeBot(default_conf)
+
+    assert open_trade.amount == limit_buy_order_old_partial['amount']
+
+    open_trade.fee_open = fee()
+    open_trade.fee_close = fee()
+    Trade.session.add(open_trade)
+    # cancelling a half-filled order should update the amount to the bought amount
+    # and apply fees if necessary.
+    freqtrade.check_handle_timedout()
+
+    assert log_has_re(r"Applying fee on amount for Trade.* Order", caplog)
+
+    assert cancel_order_mock.call_count == 1
+    assert rpc_mock.call_count == 1
+    trades = Trade.query.filter(Trade.open_order_id.is_(open_trade.open_order_id)).all()
+    assert len(trades) == 1
+    # Verify that trade has been updated
+    assert trades[0].amount == (limit_buy_order_old_partial['amount'] -
+                                limit_buy_order_old_partial['remaining']) - 0.0001
+    assert trades[0].open_order_id is None
+    assert trades[0].fee_open == 0
+
+
+def test_check_handle_timedout_partial_except(default_conf, ticker, open_trade, caplog, fee,
+                                              limit_buy_order_old_partial, trades_for_order,
+                                              limit_buy_order_old_partial_canceled, mocker) -> None:
+    rpc_mock = patch_RPCManager(mocker)
+    cancel_order_mock = MagicMock(return_value=limit_buy_order_old_partial_canceled)
+    patch_exchange(mocker)
+    mocker.patch.multiple(
+        'freqtrade.exchange.Exchange',
+        get_ticker=ticker,
+        get_order=MagicMock(return_value=limit_buy_order_old_partial),
+        cancel_order=cancel_order_mock,
+        get_trades_for_order=MagicMock(return_value=trades_for_order),
+    )
+    mocker.patch('freqtrade.freqtradebot.FreqtradeBot.get_real_amount',
+                 MagicMock(side_effect=DependencyException))
+    freqtrade = FreqtradeBot(default_conf)
+
+    assert open_trade.amount == limit_buy_order_old_partial['amount']
+
+    open_trade.fee_open = fee()
+    open_trade.fee_close = fee()
+    Trade.session.add(open_trade)
+    # cancelling a half-filled order should update the amount to the bought amount
+    # and apply fees if necessary.
+    freqtrade.check_handle_timedout()
+
+    assert log_has_re(r"Could not update trade amount: .*", caplog)
+
+    assert cancel_order_mock.call_count == 1
+    assert rpc_mock.call_count == 1
+    trades = Trade.query.filter(Trade.open_order_id.is_(open_trade.open_order_id)).all()
+    assert len(trades) == 1
+    # Verify that trade has been updated
+
+    assert trades[0].amount == (limit_buy_order_old_partial['amount'] -
+                                limit_buy_order_old_partial['remaining'])
+    assert trades[0].open_order_id is None
+    assert trades[0].fee_open == fee()
+
+
+def test_check_handle_timedout_exception(default_conf, ticker, open_trade, mocker, caplog) -> None:
     patch_RPCManager(mocker)
     patch_exchange(mocker)
     cancel_order_mock = MagicMock()

@@ -2164,34 +2171,20 @@ def test_check_handle_timedout_exception(default_conf, ticker, mocker, caplog) -
     )
     freqtrade = FreqtradeBot(default_conf)

-    open_date = arrow.utcnow().shift(minutes=-601)
-    trade_buy = Trade(
-        pair='ETH/BTC',
-        open_rate=0.00001099,
-        exchange='bittrex',
-        open_order_id='123456789',
-        amount=90.99181073,
-        fee_open=0.0,
-        fee_close=0.0,
-        stake_amount=1,
-        open_date=open_date.datetime,
-        is_open=True
-    )
-
-    Trade.session.add(trade_buy)
+    Trade.session.add(open_trade)

     freqtrade.check_handle_timedout()
     assert log_has_re(r"Cannot query order for Trade\(id=1, pair=ETH/BTC, amount=90.99181073, "
                       r"open_rate=0.00001099, open_since="
-                      f"{open_date.strftime('%Y-%m-%d %H:%M:%S')}"
+                      f"{open_trade.open_date.strftime('%Y-%m-%d %H:%M:%S')}"
                       r"\) due to Traceback \(most recent call last\):\n*",
                       caplog)


-def test_handle_timedout_limit_buy(mocker, default_conf) -> None:
+def test_handle_timedout_limit_buy(mocker, default_conf, limit_buy_order) -> None:
     patch_RPCManager(mocker)
     patch_exchange(mocker)
-    cancel_order_mock = MagicMock()
+    cancel_order_mock = MagicMock(return_value=limit_buy_order)
     mocker.patch.multiple(
         'freqtrade.exchange.Exchange',
         cancel_order=cancel_order_mock

@@ -2201,13 +2194,14 @@ def test_handle_timedout_limit_buy(mocker, default_conf) -> None:

     Trade.session = MagicMock()
     trade = MagicMock()
-    order = {'remaining': 1,
-             'amount': 1}
-    assert freqtrade.handle_timedout_limit_buy(trade, order)
+    limit_buy_order['remaining'] = limit_buy_order['amount']
+    assert freqtrade.handle_timedout_limit_buy(trade, limit_buy_order)
+    assert cancel_order_mock.call_count == 1
+
+    cancel_order_mock.reset_mock()
+    limit_buy_order['amount'] = 2
+    assert not freqtrade.handle_timedout_limit_buy(trade, limit_buy_order)
     assert cancel_order_mock.call_count == 1
-    order['amount'] = 2
-    assert not freqtrade.handle_timedout_limit_buy(trade, order)
-    assert cancel_order_mock.call_count == 2


 def test_handle_timedout_limit_sell(mocker, default_conf) -> None:

@@ -3361,7 +3355,7 @@ def test_get_real_amount_wrong_amount(default_conf, trades_for_order, buy_order_
     patch_get_signal(freqtrade)

     # Amount does not change
-    with pytest.raises(OperationalException, match=r"Half bought\? Amounts don't match"):
+    with pytest.raises(DependencyException, match=r"Half bought\? Amounts don't match"):
         freqtrade.get_real_amount(trade, limit_buy_order)

@@ -214,11 +214,12 @@ def test_generate_plot_file(mocker, caplog):
     store_plot_file(fig, filename="freqtrade-plot-UNITTEST_BTC-5m.html",
                     directory=Path("user_data/plots"))

+    expected_fn = str(Path("user_data/plots/freqtrade-plot-UNITTEST_BTC-5m.html"))
     assert plot_mock.call_count == 1
     assert plot_mock.call_args[0][0] == fig
     assert (plot_mock.call_args_list[0][1]['filename']
-            == "user_data/plots/freqtrade-plot-UNITTEST_BTC-5m.html")
-    assert log_has("Stored plot as user_data/plots/freqtrade-plot-UNITTEST_BTC-5m.html",
+            == expected_fn)
+    assert log_has(f"Stored plot as {expected_fn}",
                    caplog)

@@ -5,9 +5,6 @@ from freqtrade.configuration import TimeRange


 def test_parse_timerange_incorrect() -> None:
-    assert TimeRange(None, 'line', 0, -200) == TimeRange.parse_timerange('-200')
-    assert TimeRange('line', None, 200, 0) == TimeRange.parse_timerange('200-')
-    assert TimeRange('index', 'index', 200, 500) == TimeRange.parse_timerange('200-500')

     assert TimeRange('date', None, 1274486400, 0) == TimeRange.parse_timerange('20100522-')
     assert TimeRange(None, 'date', 0, 1274486400) == TimeRange.parse_timerange('-20100522')

@@ -20,9 +17,14 @@ def test_parse_timerange_incorrect() -> None:
     timerange = TimeRange.parse_timerange('1231006505-1233360000')
     assert TimeRange('date', 'date', 1231006505, 1233360000) == timerange

-    # TODO: Find solution for the following case (passing timestamp in ms)
     timerange = TimeRange.parse_timerange('1231006505000-1233360000000')
-    assert TimeRange('date', 'date', 1231006505, 1233360000) != timerange
+    assert TimeRange('date', 'date', 1231006505, 1233360000) == timerange
+
+    timerange = TimeRange.parse_timerange('1231006505000-')
+    assert TimeRange('date', None, 1231006505, 0) == timerange
+
+    timerange = TimeRange.parse_timerange('-1231006505000')
+    assert TimeRange(None, 'date', 0, 1231006505) == timerange

     with pytest.raises(Exception, match=r'Incorrect syntax.*'):
         TimeRange.parse_timerange('-')

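The case that was previously marked TODO (timestamps passed in milliseconds) now parses to the same second-based range. One way such a normalization can be done, shown purely as an illustrative sketch; the real logic lives in TimeRange.parse_timerange and is not part of this diff:

    def _normalize_timestamp(raw: str) -> int:
        # Hypothetical helper: accept POSIX timestamps in seconds or milliseconds.
        value = int(raw)
        if len(raw) >= 13:  # 13-digit values are almost certainly milliseconds
            value //= 1000
        return value


    assert _normalize_timestamp('1231006505') == 1231006505
    assert _normalize_timestamp('1231006505000') == 1231006505
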
@@ -497,3 +497,25 @@ def test_download_data_no_pairs(mocker, caplog):
     with pytest.raises(OperationalException,
                        match=r"Downloading data requires a list of pairs\..*"):
         start_download_data(pargs)
+
+
+def test_download_data_trades(mocker, caplog):
+    dl_mock = mocker.patch('freqtrade.utils.refresh_backtest_trades_data',
+                           MagicMock(return_value=[]))
+    convert_mock = mocker.patch('freqtrade.utils.convert_trades_to_ohlcv',
+                                MagicMock(return_value=[]))
+    patch_exchange(mocker)
+    mocker.patch(
+        'freqtrade.exchange.Exchange.markets', PropertyMock(return_value={})
+    )
+    args = [
+        "download-data",
+        "--exchange", "kraken",
+        "--pairs", "ETH/BTC", "XRP/BTC",
+        "--days", "20",
+        "--dl-trades"
+    ]
+    start_download_data(get_args(args))
+    assert dl_mock.call_args[1]['timerange'].starttype == "date"
+    assert dl_mock.call_count == 1
+    assert convert_mock.call_count == 1

New vendored test data files (contents not rendered in the diff):
- tests/testdata/XRP_ETH-1m.json (diff suppressed because one or more lines are too long)
- tests/testdata/XRP_ETH-5m.json (diff suppressed because one or more lines are too long)
- tests/testdata/XRP_ETH-trades.json.gz (binary file not shown)

@@ -34,9 +34,8 @@ class SampleStrategy(IStrategy):
     # Minimal ROI designed for the strategy.
     # This attribute will be overridden if the config file contains "minimal_roi".
     minimal_roi = {
-        "40": 0.0,
-        "30": 0.01,
-        "20": 0.02,
+        "60": 0.01,
+        "30": 0.02,
         "0": 0.04
     }

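For context on the new ROI table: the keys are minutes since the trade was opened and the values are the minimal return at which the bot takes profit, so the table above reads roughly as "4% immediately, 2% after 30 minutes, 1% after 60 minutes". A tiny illustrative lookup, not freqtrade's implementation:

    minimal_roi = {"60": 0.01, "30": 0.02, "0": 0.04}


    def required_roi(minutes_open: float) -> float:
        # Pick the row with the largest duration the trade has already exceeded.
        applicable = [int(k) for k in minimal_roi if int(k) <= minutes_open]
        return minimal_roi[str(max(applicable))]


    assert required_roi(0) == 0.04
    assert required_roi(45) == 0.02
    assert required_roi(90) == 0.01
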
@@ -99,13 +98,16 @@ class SampleStrategy(IStrategy):
         :return: a Dataframe with all mandatory indicators for the strategies
         """

-        # Momentum Indicator
+        # Momentum Indicators
         # ------------------------------------

+        # RSI
+        dataframe['rsi'] = ta.RSI(dataframe)
+
+        """
         # ADX
         dataframe['adx'] = ta.ADX(dataframe)

-        """
         # Awesome oscillator
         dataframe['ao'] = qtpylib.awesome_oscillator(dataframe)

@@ -133,9 +135,6 @@ class SampleStrategy(IStrategy):
         # ROC
         dataframe['roc'] = ta.ROC(dataframe)

-        # RSI
-        dataframe['rsi'] = ta.RSI(dataframe)
-
         # Inverse Fisher transform on RSI, values [-1.0, 1.0] (https://goo.gl/2JGGoy)
         rsi = 0.1 * (dataframe['rsi'] - 50)
         dataframe['fisher_rsi'] = (numpy.exp(2 * rsi) - 1) / (numpy.exp(2 * rsi) + 1)

@@ -255,7 +254,7 @@ class SampleStrategy(IStrategy):
         dataframe['ha_low'] = heikinashi['low']
         """

-        # Retrieve best bid and best ask
+        # Retrieve best bid and best ask from the orderbook
         # ------------------------------------
         """
         # first check if dataprovider is available

@@ -277,9 +276,9 @@ class SampleStrategy(IStrategy):
         """
         dataframe.loc[
             (
-                (dataframe['adx'] > 30) &
-                (dataframe['tema'] <= dataframe['bb_middleband']) &
-                (dataframe['tema'] > dataframe['tema'].shift(1)) &
+                (qtpylib.crossed_above(dataframe['rsi'], 30)) &  # Signal: RSI crosses above 30
+                (dataframe['tema'] <= dataframe['bb_middleband']) &  # Guard: tema below BB middle
+                (dataframe['tema'] > dataframe['tema'].shift(1)) &  # Guard: tema is rising
                 (dataframe['volume'] > 0)  # Make sure Volume is not 0
             ),
             'buy'] = 1

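The buy signal now fires on an RSI cross instead of an ADX level. qtpylib.crossed_above(series, value) is true only on the candle where the series moves from at-or-below the value to above it; a rough pandas sketch of that semantics, not qtpylib's code:

    import pandas as pd


    def crossed_above_sketch(series: pd.Series, value: float) -> pd.Series:
        # True only on candles where the series crosses the threshold upwards.
        return (series > value) & (series.shift(1) <= value)


    rsi = pd.Series([25, 28, 31, 35])
    print(crossed_above_sketch(rsi, 30).tolist())  # [False, False, True, False]
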
@@ -295,9 +294,9 @@ class SampleStrategy(IStrategy):
         """
         dataframe.loc[
             (
-                (dataframe['adx'] > 70) &
-                (dataframe['tema'] > dataframe['bb_middleband']) &
-                (dataframe['tema'] < dataframe['tema'].shift(1)) &
+                (qtpylib.crossed_above(dataframe['rsi'], 70)) &  # Signal: RSI crosses above 70
+                (dataframe['tema'] > dataframe['bb_middleband']) &  # Guard: tema above BB middle
+                (dataframe['tema'] < dataframe['tema'].shift(1)) &  # Guard: tema is falling
                 (dataframe['volume'] > 0)  # Make sure Volume is not 0
             ),
             'sell'] = 1