
vectorbt's Introduction

vectorbt's People

Contributors

barnjamin, emiliobasualdo, forlindre, haxdds, izikeros, jacopomaroli, kalindro, kmcentush, manhinhang, polakowo, rileymshea


vectorbt's Issues

Please use a permissive license

Dear Polakowo,

the GPL license makes it untenable to take this beyond a prototype phase.

Would it be possible to relicense this under the MIT or Apache-2.0 license?

Then startups could use it. It's handy to have a fast backtesting library, because financial trading is an easy test environment for AI development (clear utility function).

Thanks in advance; I would love to be able to try this without legal headaches.

Error of Portfolio.from_signals

Hi, thanks for this library. I ran into a problem while testing the example.

import vectorbt as vbt
import pandas as pd
import yfinance as yf

price = yf.Ticker("BTC-USD").history(period="max")
entries, exits = pd.Series.vbt.signals.generate_random_both(
    price.shape[0], n=10, seed=42
)

portfolio = vbt.Portfolio.from_signals(
    price['Close'], entries, exits,
    fees=0.001,
    init_capital=100,
    freq='1D'
)

Then it always failed with:
raise ValueError("Only SizeType.Shares and SizeType.Cash are supported")

my environment is:
python 3.8
pandas 1.1.1
vectorbt 0.13.7

Closed positions not shown

Hi,

Thank you for your work.
I tested this example:

# %%
import numpy as np
import pandas as pd
import vectorbt as vbt
from vectorbt.portfolio.enums import SizeType, CallSeqType

# %%
np.random.seed(42)
price = pd.DataFrame(np.random.uniform(1, 10, size=(5, 3)))
orders = pd.DataFrame(np.full((5, 3), 1.) / 3)  # each column 33.3%
orders[1::2] = np.nan  # skip every second tick

# %%
portfolio = vbt.Portfolio.from_orders(
    price,  # reference price for portfolio value
    orders,
    order_price=price,  # order price
    size_type=SizeType.TargetPercent,
    val_price=price,  # order price known beforehand (don't do it)
    call_seq=CallSeqType.Auto,  # first sell then buy
    group_by=np.array([0, 0, 0]),
    cash_sharing=True,
    fees=0.001, fixed_fees=1., slippage=0.001,
)

# %%
positions = vbt.Positions.from_orders(portfolio.orders())
# the output is 0
positions.closed.count()

But when I test another example, the closed positions do show:

price = pd.Series([1., 2., 3., 4., 3., 2., 1.])
orders = pd.Series([1., -0.5, -0.5, 1., -1., 2., -1.])
portfolio = vbt.Portfolio.from_orders(price, orders, init_cash=100., freq='1D')

# 
positions = vbt.Positions.from_orders(portfolio.orders())
# the output is 2
positions.closed.count()

Maybe floating-point comparison leads to this.

ModuleNotFoundError: No module named 'vectorbt.utils'

I have just installed the package. When I tried to import it, I got the following error:

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-4-cd35858d5c92> in <module>
----> 1 import vectorbt as vbt
      2 import numpy as np

C:\ProgramData\Anaconda3\lib\site-packages\vectorbt\__init__.py in <module>
----> 1 from vectorbt import utils, accessors, timeseries, widgets, signals, portfolio, indicators, defaults
      2 
      3 # Most important classes
      4 from vectorbt.widgets import Indicator, Bar, Scatter, Histogram, Heatmap
      5 from vectorbt.portfolio import Portfolio

C:\ProgramData\Anaconda3\lib\site-packages\vectorbt\accessors.py in <module>
      1 import pandas as pd
      2 from pandas.core.accessor import _register_accessor, DirNamesMixin
----> 3 from vectorbt.utils.accessors import Base_DFAccessor, Base_SRAccessor
      4 
      5 

ModuleNotFoundError: No module named 'vectorbt.utils'

My guess is that the vectorbt.utils submodule is not exported in __init__.py?

AttributeError in BitcoinDMAC.ipynb

Hi,

I'm just getting started with vectorbt.
When testing BitcoinDMAC.ipynb I get the following error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-6-ed264f0478b5> in <module>
     10 freq = '1D'
     11 
---> 12 vbt.defaults.portfolio['init_cash'] = 100. # in $
     13 vbt.defaults.portfolio['fees'] = 0.0025 # in %
     14 vbt.defaults.portfolio['slippage'] = 0.0025 # in %

AttributeError: module 'vectorbt' has no attribute 'defaults'

In the documentation I did not find vectorbt.defaults but vectorbt.settings with vectorbt.settings.portfolio.
The correction is then:

vbt.settings.portfolio['init_cash'] = 100. # in $
vbt.settings.portfolio['fees'] = 0.0025 # in %
vbt.settings.portfolio['slippage'] = 0.0025 # in %

Integration of stop with basic signals extended with run combs

I am new to your package and was hoping you might answer a general setup question:

Can you provide a quick example of how to add stop-loss logic to an existing strategy of the form:

fast_ma, slow_ma = vbt.MA.run_combs(price, window=windows, r=2, short_names=['fast', 'slow'])
entries = fast_ma.ma_above(slow_ma, crossed=True)
exits = fast_ma.ma_below(slow_ma, crossed=True)

[STOP LOSS logic here]

such that the stop loss and the normal exit logic integrate together?
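One common pattern (a sketch, assuming you already have a stop_exits signal from whatever stop generator your vectorbt version offers, such as generate_stop_exits): combine the stop exits with the crossover exits by boolean OR before passing them to Portfolio.from_signals.

```python
import pandas as pd

# toy boolean signals standing in for the real ones (assumed aligned on the same index);
# `stop_exits` would come from your stop-loss generator
index = pd.date_range('2020-01-01', periods=5)
entries = pd.Series([True, False, False, True, False], index=index)
exits = pd.Series([False, False, True, False, False], index=index)       # crossover exits
stop_exits = pd.Series([False, True, False, False, True], index=index)   # stop-loss exits

# an exit fires if EITHER the crossover or the stop triggers
combined_exits = exits | stop_exits
print(combined_exits.tolist())  # [False, True, True, False, True]
```

The combined series can then be passed wherever the plain exits would go.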

Syntax error when passing indicators which start with numbers to IndicatorFactory

Hi,

Just noticing the following:

vbt.IndicatorFactory(output_names=['real'], input_names=['1d_max']).from_apply_func(lambda ts: ts).run(new_indicator)

produces: {SyntaxError}invalid syntax (factory.py, line 2)

but:

vbt.IndicatorFactory(output_names=['real'], input_names=['a_1d_max']).from_apply_func(lambda ts: ts).run(new_indicator)

works fine

Not sure if it's an issue or not, but it seems odd enough to note.
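For context, this looks explainable (my assumption, based on the SyntaxError pointing into generated code): IndicatorFactory generates Python code in which each input name becomes an identifier, and Python identifiers cannot start with a digit.

```python
# Python identifiers cannot start with a digit, so a generated function
# parameter named '1d_max' produces invalid source code
print('1d_max'.isidentifier())    # False -> breaks the generated code
print('a_1d_max'.isidentifier())  # True  -> works fine
```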

Recent problem importing vectorbt into Google Colab

Hey there. First off, I love vectorbt and have been using it for months. I do all of my development on Google Colab (Colaboratory). Every time I start a session I usually run "pip install vectorbt" then "import vectorbt as vbt", and for months that has been working with no problems.

Sometime in the last 24 hours or so, however, I have started receiving a new error:

/usr/local/lib/python3.6/dist-packages/vectorbt/settings.py in <module>()
     61
     62 # Templates
---> 63 with open(os.path.join(os.path.dirname(__file__), 'templates/light.json')) as json_file:
     64     light_template = Config(json.load(json_file))
     65     """_"""

FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/lib/python3.6/dist-packages/vectorbt/templates/light.json'

I noticed that in your main folder here on GitHub you now have a folder called templates with 2 JSON files inside. I temporarily fixed the problem by downloading these and re-uploading them through Colab's file browser into the vectorbt/ folder, and then all is well.

But that isn't a very good solution: everything in the session runtime on Google resets, those files get deleted, and then I can't import vectorbt next time without going through that whole process again, which I don't think there is a way to automate either.

I tried a fresh install of the latest version 0.15.1, but for some reason the templates folder isn't being included. Could there be something going on with what pip grabs to install that makes it miss this folder?

Any ideas would be much appreciated. Thanks so much!
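For what it's worth, non-Python data files only end up in a wheel/sdist when they are declared in the packaging metadata. A hypothetical setup.py excerpt illustrating the idea (mypkg is a placeholder, this is not vectorbt's actual configuration):

```python
from setuptools import setup, find_packages

setup(
    name='mypkg',                                  # hypothetical package name
    packages=find_packages(),
    # declare non-Python files so they are shipped with the package
    package_data={'mypkg': ['templates/*.json']},
    include_package_data=True,
)
```

If such a declaration is missing (or the MANIFEST.in doesn't cover the folder), pip installs only the .py files, which would match the symptom here.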

Plotly Name Arg Not Cast to String

I noticed that the name is not being cast to a string in create_scatter(). It's possible that it's also not being cast to a string in the other generic plotting functions in vectorbt, but I haven't taken a look.

Steps to reproduce:

import numpy as np
import pandas as pd
import vectorbt as vbt

# Get prices
prices = pd.DataFrame(np.array([[1, 1.1, 1.05, 1.11, 1.12], [1.03, 1.13, 1.07, 1.12, 1.13]]).T, columns=['BTC', 'ETH'])
prices.columns.name = 'asset'
prices.index.name = 'Date'

print(prices)

# Get order target percentages
cols = pd.MultiIndex.from_product([[1, 2], [2, 3], ['BTC', 'ETH']])
tgt_pct = pd.DataFrame(np.array([[0, 0.02, 0.03, 0, 0.05], [0.01, 0.03, 0.01, 0.02, 0.04],
                                [0, 0.04, 0.01, 0.02, 0.04], [0.03, 0.05, 0, 0.02, 0.03],
                                [0.01, 0.03, 0.01, 0.02, 0.04], [0, 0.04, 0.01, 0.02, 0.04],
                                [0.03, 0.05, 0, 0.02, 0.03], [0.01, 0.03, 0.01, 0.02, 0.04],
]).T, columns=cols)
tgt_pct.columns.names = ['custom_param1', 'custom_param2', 'asset']
tgt_pct.index.name = 'Date'

print(tgt_pct)

# Align prices
prices = prices.vbt.align_to(tgt_pct)

# Run the portfolio
size_type = getattr(vbt.portfolio.enums.SizeType, 'TargetPercent')
portfolio = vbt.Portfolio.from_orders(prices, tgt_pct, size_type=size_type, freq='1D')

# Plot a subset of the trades
res = portfolio.xs((2, 3, 'BTC'), axis=1).trades.pnl.to_matrix()

# Errors
# res.vbt.scatter()

# Works
res.name = str(res.name)  # because of how we sliced the portfolio, "name" is a tuple that Plotly can't handle; casting is necessary
res.vbt.scatter()

Solution:
Change "vectorbt\generic\plotting.py" line 226 from "name=trace_name," to "name=str(trace_name),"

Would you prefer I do a PR with this (and check the other plotting functions), or would you rather merge this into some development branch yourself?

Entry/Exit Dates Not Displaying Correctly on Plot for Trades w/ get_orders() SizeType = 'TargetPercent'

Code

Here's some code to reproduce the issue:

import numpy as np
import pandas as pd
import vectorbt as vbt

# Get prices
prices = pd.DataFrame(np.array([[1, 1.1, 1.05, 1.11, 1.12], [1.03, 1.13, 1.07, 1.12, 1.13]]).T, columns=['BTC', 'ETH'])
prices.columns.name = 'asset'
prices.index.name = 'Date'

print(prices)

# Get order target percentages
cols = pd.MultiIndex.from_product([[1, 2], [2, 3], ['BTC', 'ETH']])
tgt_pct = pd.DataFrame(np.array([[0, 0.02, 0.03, 0, 0.05], [0.01, 0.03, 0.01, 0.02, 0.04],
                                [0, 0.04, 0.01, 0.02, 0.04], [0.03, 0.05, 0, 0.02, 0.03],
                                [0.01, 0.03, 0.01, 0.02, 0.04], [0, 0.04, 0.01, 0.02, 0.04],
                                [0.03, 0.05, 0, 0.02, 0.03], [0.01, 0.03, 0.01, 0.02, 0.04],
]).T, columns=cols)
tgt_pct.columns.names = ['custom_param1', 'custom_param2', 'asset']
tgt_pct.index.name = 'Date'

print(tgt_pct)

# Align prices
prices = prices.vbt.align_to(tgt_pct)

# Run the portfolio
size_type = getattr(vbt.portfolio.enums.SizeType, 'TargetPercent')
portfolio = vbt.Portfolio.from_orders(prices, tgt_pct, size_type=size_type, freq='1D')

# Plot a subset of the trades
portfolio.xs((2, 3, 'BTC'), axis=1).trades.plot().show_png()
portfolio.xs((2, 3, 'ETH'), axis=1).trades.plot().show_png()

Here are the plots produced by the snippet:
image
image

Issue

I'm not sure what the "proper" fix would look like for this problem, but all of the entry dates are being set to the date that the position was first held (even after closing an entire position by setting the order target percent to 0). Additionally, it's not clear to me what determines an "exit" in this case. Dates where the order's target percentage is lower than the previous day's are not always showing up as exits. E.g., Date 2 for (2, 3, 'BTC') has no exit on the plot, but Date 3 does. However, Date 2 is an exit for (2, 3, 'ETH'). Are the order target percentages for Date i being applied to the simulation on Date i, or Date i + 1? The above example makes me feel like it's both, based on the plot results...

Apologies if anything above is confusing. I'm still trying to wrap my head around how the simulation takes order target percentages into effect.

Would there be any way to convey that additional/fewer holdings were taken in a position while showing the entry date as when a position changed? I guess this is more of a visualization problem than functionality problem. I believe that the sharpe, etc. are still correct for this scenario.

Cryptic Numba issues while trying to reproduce the documentation examples

Hello.

I'm trying to start working with VectorBT, but I keep getting some very weird errors from Numba while invoking the code. Given that Numba is really bad at error descriptions, it's hard to say whether it's a mistake I made while copy-pasting or a version problem.

Here's a proof of concept exhibiting the issue. It's mostly copy-pasted from the documentation and the article comparing stop types:

#!/usr/bin/env python3

import pandas as pd
import yfinance as yf
import vectorbt as vbt
from datetime import datetime

df = yf.Ticker('BTC-USD').history(interval='1h',
                                  start=datetime(2020, 1, 1),
                                  end=datetime(2020, 12, 1))

# as if we had multiple data frames; that doesn't seem to influence the bug
# df.columns = pd.MultiIndex.from_tuples(
#     (c, 'BTC-USD')
#     for c in df.columns
# )

fast_ma = vbt.MA.run(df['Close'], 10, short_name='fast')
slow_ma = vbt.MA.run(df['Close'], 20, short_name='slow')

entries = fast_ma.ma_above(slow_ma, crossed=True)

exits = vbt.ADVSTEX.run(
    entries,
    df['Open'],
    df['High'],
    df['Low'],
    df['Close'],
    ts_stop=[0.1],
    stop_type=None, hit_price=None
).exits

And its output is the following:

Traceback (most recent call last):
  File "/home/naquad/projects/tests/strat2/try1.py", line 39, in <module>
    tp_exits = df['BUY'].vbt.signals.generate_stop_exits(np.array([0.2]), 0, trailing=True)
  File "/home/naquad/.local/lib/python3.9/site-packages/vectorbt/signals/accessors.py", line 469, in generate_stop_exits
    exits = nb.generate_stop_ex_nb(
  File "/home/naquad/.local/lib/python3.9/site-packages/numba/core/dispatcher.py", line 415, in _compile_for_args
    error_rewrite(e, 'typing')
  File "/home/naquad/.local/lib/python3.9/site-packages/numba/core/dispatcher.py", line 358, in error_rewrite
    reraise(type(e), e, None)
  File "/home/naquad/.local/lib/python3.9/site-packages/numba/core/utils.py", line 80, in reraise
    raise value.with_traceback(tb)
numba.core.errors.TypingError: Failed in nopython mode pipeline (step: nopython frontend)
Internal error at <numba.core.typeinfer.CallConstraint object at 0x7f4cb1d21fd0>.
Failed in nopython mode pipeline (step: analyzing bytecode)
Use of unsupported opcode (LIST_EXTEND) found

File "../../../.local/lib/python3.9/site-packages/vectorbt/signals/nb.py", line 79:
def generate_ex_nb(entries, wait, exit_choice_func_nb, *args):
    <source elided>
                # Run the UDF
                idxs = exit_choice_func_nb(col, from_i, to_i, *args)
                ^

During: resolving callee type: type(CPUDispatcher(<function generate_ex_nb at 0x7f4cb8545790>))
During: typing of call at /home/naquad/.local/lib/python3.9/site-packages/vectorbt/signals/nb.py (471)

Enable logging at debug level for details.

File "../../../.local/lib/python3.9/site-packages/vectorbt/signals/nb.py", line 471:
def generate_stop_ex_nb(entries, ts, stop, trailing, wait, first, flex_2d):
    <source elided>
    temp_idx_arr = np.empty((entries.shape[0],), dtype=np.int_)
    return generate_ex_nb(entries, wait, stop_choice_nb, ts, stop, trailing, wait, first, temp_idx_arr, flex_2d)
    ^

I've seen other errors like Use of unsupported opcode (...) found, including CONTAINS_OP and similar ones. The versions I'm using are the following:

Versions:
Python 3.9
numba 0.51.2
numpy 1.19.4
pandas 1.1.4
vectorbt 0.14.4
llvmlite 0.34.0
llvm 10.0.1-3

If there are recommended versions, please state them so I can try running it in a Docker container.
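A note on the likely cause (my reading of the traceback, not confirmed by the maintainers): LIST_EXTEND is a bytecode opcode introduced in CPython 3.9, so a Numba release that predates Python 3.9 support cannot compile functions containing it. Upgrading numba, or running under Python 3.8, are the usual escapes. You can check whether your interpreter emits that opcode at all:

```python
import dis
import sys

# LIST_EXTEND entered the bytecode opcode table in CPython 3.9;
# on 3.8 and earlier it is absent and this prints False
print(sys.version_info[:2], 'LIST_EXTEND' in dis.opmap)
```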

P.S. Thank you for your work; as someone who wrote a much smaller and more limited backtesting engine than VectorBT, I appreciate the amount of work it took.

Support for shorting

Hi,
Great job. I would like to suggest adding a shorting feature; I have been unable to figure out how it could be implemented in a nice way.

Trading schedule

Is it possible to have some sort of trading schedule in vectorbt?
Let's say I want to trade only 8 hours a day and close my positions at the end of the day.
Is it possible to write a strategy that closes positions at the end of the trading day? I don't think it's currently possible with vectorbt, but I just wanted to make sure.
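As a workaround (a sketch, not a built-in feature as far as I know): you can derive forced end-of-day exit signals from the datetime index and OR them into your strategy's exits before simulation.

```python
import numpy as np
import pandas as pd

# toy intraday index spanning two calendar days
idx = pd.date_range('2021-01-04 09:30', periods=32, freq='30min')
price = pd.Series(np.arange(32, dtype=float), index=idx)

# True on the last bar of each calendar day: keep='last' marks every
# duplicate date except its last occurrence, so the negation flags the close
eod_exits = ~pd.Series(idx.date, index=idx).duplicated(keep='last')
print(int(eod_exits.sum()))  # 2 (one forced exit per day)
```

The resulting boolean series can be combined with strategy exits, e.g. `exits | eod_exits`.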

Portfolio mode providing price of an order

Awesome library, and well done. I'm looking to use the library in a context where it's important that I can specify (for example) that the entry signal was on the open price but the exit signal was on the close price. Or, sometimes, that the entry was at a particular price (such as for a limit or stop order).

At the moment, when providing signals to calculate portfolio, all I can do is specify either True/False with a single price series, or size (+/-) with same price series. What I'd like is to be able to specify the specific price of the entry or exit. Thoughts?
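A possible workaround, assuming the from_orders/from_signals variant in your version accepts a per-element price array for the order price: build that array yourself, e.g. the open on entry bars and the close everywhere else.

```python
import numpy as np
import pandas as pd

# toy data; in practice these come from your OHLC frame and signal generator
idx = pd.date_range('2020-01-01', periods=5)
open_ = pd.Series([10., 11., 12., 13., 14.], index=idx)
close = pd.Series([10.5, 11.5, 12.5, 13.5, 14.5], index=idx)
entries = pd.Series([True, False, False, True, False], index=idx)

# use the open on entry bars, the close elsewhere; pass the result wherever
# your vectorbt version takes the order price (parameter name varies)
order_price = pd.Series(np.where(entries, open_, close), index=idx)
print(order_price.tolist())  # [10.0, 11.5, 12.5, 13.0, 14.5]
```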

example rand_exit_choice_nb can not run

Hi,

I found an example that cannot run:

https://github.com/polakowo/vectorbt/blob/master/vectorbt/signals/factory.py#L316

  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\vectorbt\signals\factory.py", line 61, in __init__
    IndicatorFactory.__init__(
TypeError: __init__() got an unexpected keyword argument 'in_output_settings'
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\vectorbt\signals\factory.py", line 61, in __init__
    IndicatorFactory.__init__(
TypeError: __init__() got an unexpected keyword argument 'param_settings'
Traceback (most recent call last):
  File "D:/test_vectorbt/demo_stop3.py", line 66, in <module>
    my_sig.rand_type_readable
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\vectorbt\indicators\factory.py", line 1181, in attr_readable
    return getattr(_self, attr_name).applymap(lambda x: '' if x == -1 else enum._fields[x])
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\pandas\core\frame.py", line 6944, in applymap
    return self.apply(infer)
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\pandas\core\frame.py", line 6878, in apply
    return op.get_result()
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\pandas\core\apply.py", line 186, in get_result
    return self.apply_standard()
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\pandas\core\apply.py", line 313, in apply_standard
    results, res_index = self.apply_series_generator()
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\pandas\core\apply.py", line 341, in apply_series_generator
    results[i] = self.f(v)
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\pandas\core\frame.py", line 6942, in infer
    return lib.map_infer(x.astype(object).values, func)
  File "pandas\_libs\lib.pyx", line 2329, in pandas._libs.lib.map_infer
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\vectorbt\indicators\factory.py", line 1181, in <lambda>
    return getattr(_self, attr_name).applymap(lambda x: '' if x == -1 else enum._fields[x])
TypeError: tuple indices must be integers or slices, not float

Portfolio.stats() as a function

Hi, thanks for the new version of vbt. I am curious how, given a pd.Series of returns, to get a table like Portfolio.stats().

Running the example in portfolio.base shows different numbers from those in the documentation

Hi,

I was testing the code found at https://polakowo.io/vectorbt/docs/portfolio/base.html#example and can't get it to show the same numbers as the ones provided in the document. The version of vectorbt is 0.14.2.

I am getting for comb_portfolio.total_profit():

group
first    -49667.155920
second   -64422.909441

But the example in the documentation shows:

group
first     21793.882832
second     8333.660493

import numpy as np
import pandas as pd
from datetime import datetime
import yfinance as yf
import talib
import vectorbt as vbt
from vectorbt.portfolio.enums import InitCashMode


# Fetch price history
pairs = ['BTC-USD', 'ETH-USD', 'XRP-USD', 'BNB-USD', 'BCH-USD', 'LTC-USD']
start = datetime(2020, 1, 1)
end = datetime(2020, 9, 1)
pair_history = {p: yf.Ticker(p).history(start=start, end=end) for p in pairs}

# Put assets into a single dataframe by price type
price = {}
for pt in ['Open', 'High', 'Low', 'Close']:
    price[pt] = pd.DataFrame({p: df[pt] for p, df in pair_history.items()})

# Run every single pattern recognition indicator and combine results
result = pd.DataFrame.vbt.empty_like(price['Open'], fill_value=0.)
for pattern in talib.get_function_groups()['Pattern Recognition']:
    PRecognizer = vbt.IndicatorFactory.from_talib(pattern)
    pr = PRecognizer.run(price['Open'], price['High'], price['Low'], price['Close'])
    result = result + pr.integer

# Don't look into future
result = result.vbt.fshift(1)

# Treat each number as order value in USD
order_size = result / price['Open']

# Simulate combined portfolio
group_by = pd.Index([
    'first', 'first', 'first',
    'second', 'second', 'second'
], name='group')
comb_portfolio = vbt.Portfolio.from_orders(
    price['Close'], order_size, order_price=price['Open'],
    init_cash=InitCashMode.AutoAlign, fees=0.001, slippage=0.001,
    group_by=group_by, cash_sharing=True
)

# Get total profit per group
comb_portfolio.total_profit()

Backtest strategy with prices and signs

I tried to use the package today, but it's hard to find the proper function for my problem.

I have a simple setup: I am investing in SPY at minute resolution and have already calculated signals using ML methods. Now I would like to backtest my strategy. So I have a simple data frame with two columns: prices (or returns) and signals (1 and 0, where 1 is hold and 0 is do not hold). Is there a simple function to backtest this kind of strategy, given a price (or return) vector and a signal vector?
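If I understand the setup, the usual trick is to turn the 1/0 hold column into entry/exit signals by differencing and then feed those into a signal-based backtest such as Portfolio.from_signals (a sketch with toy data):

```python
import pandas as pd

df = pd.DataFrame({
    'price':  [100., 101., 102., 101., 100., 99.],
    'signal': [0, 1, 1, 0, 1, 0],  # 1 = hold, 0 = flat
})

# a 0 -> 1 transition is an entry, a 1 -> 0 transition is an exit;
# the first row falls back to the signal itself (an initial 1 would be an entry)
change = df['signal'].diff().fillna(df['signal'])
entries = change == 1
exits = change == -1
print(entries.tolist())  # [False, True, False, False, True, False]
print(exits.tolist())    # [False, False, False, True, False, True]
```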

Trailing Stoploss & Take-Profits

First of all, well done with this package! It looks great and I'm very much looking forward to playing with it.

With regards to the vectorized nature of the backtest, how would you generate the sell signals for trailing SL & TP?

Thanks!

Support iterative take profit or loss signals

Hi,

I'm looking for a way to combine take-profit and stop-loss signals when iteratively is True,
so you could make your exit signal take profit OR stop loss (exit if profit > x or loss > y).
Presently both functions return entry and exit objects, so it seems like combining them would not work.

Returning None from order_func_nb causes numba TypingError

Error:

TypingError: Failed in nopython mode pipeline (step: nopython frontend)
Failed in nopython mode pipeline (step: nopython frontend)
Unknown attribute 'size' of type none

Code (taken from the example, with a modified order_func_nb):

import vectorbt as vbt

from vectorbt.portfolio.enums import SizeType, AccumulateExitMode, ConflictMode

import numpy as np
import pandas as pd
from datetime import datetime, timedelta
from numba import njit, f8, i8, b1, optional
from datetime import datetime

price = pd.Series([1, 2, 3, 2, 1], index=pd.Index([
    datetime(2020, 1, 1),
    datetime(2020, 1, 2),
    datetime(2020, 1, 3),
    datetime(2020, 1, 4),
    datetime(2020, 1, 5)
]))
entry_price = price * 0.9
exit_price = price * 1.1


@njit
def order_func_nb(order_context, price, fees, fixed_fees, slippage):
    # i = order_context.i
    # col = order_context.col
    # size = col + 1
    # if i % 2 == 1:
    #     size *= -1
    # return vbt.portfolio.nb.Order(
    #     size, SizeType.Shares, price[i, col], fees[i, col], fixed_fees[i, col], slippage[i, col])    
    return None

portfolio = vbt.Portfolio.from_order_func(
    price, 
    order_func_nb, 
    price.values[:, None],
    np.full(price.shape, 0.01)[:, None],
    np.full(price.shape, 1)[:, None],
    np.full(price.shape, 0.01)[:, None]
)

call talib within numba

Following the numba docs, this example with ADX appears to be working.

The Cython version exposes only the relevant inputs and does some checks, so it's not 1:1.

I don't think the Cython version can be called directly because it wraps the C code.

Null elements appear at the end because, before calling the indicator function, the lookback period should be computed (by another ta_lib function, which this example is lacking) and the out pointer passed to the indicator function should be shifted accordingly.

Sooo... it's not just an import away :(

from numba import jit
import numba as nb
import numpy as np
from ctypes import *
import ctypes
import talib.abstract as ta

ta_lib_path = "/usr/local/lib/libta_lib.so"

talib = CDLL(ta_lib_path)

np.random.seed(42)
arr = np.random.uniform(0, 1, 100)

beg = np.full(1, 0).ctypes.data_as(c_void_p)
nbe = np.full(1, 0).ctypes.data_as(c_void_p)

high = arr.copy().ctypes.data_as(c_void_p)
low = high
close = high
np_high = arr.copy()
np_low = np_high
np_close = np_high

np_out = np.full(arr.shape[0], np.nan)
out = np_out.ctypes.data_as(c_void_p)
window = 20
start, end = 0, 99
c_window = c_int64(window)
c_start, c_end = c_int64(start), c_int64(end)

adx_fn = talib.TA_ADX
adx_fn.argtypes = [
    c_int64,
    c_int64,
    c_void_p,
    c_void_p,
    c_void_p,
    c_int64,
    c_void_p,
    c_void_p,
    c_void_p,
]
adx_fn.restype = c_int64

adx_fn_type = ctypes.CFUNCTYPE(adx_fn.restype, *adx_fn.argtypes)(adx_fn)

@nb.extending.intrinsic
def address_as_void_pointer(typingctx, src):
    """ returns a void pointer from a given memory address """
    from numba.core import types, cgutils

    sig = types.voidptr(src)

    def codegen(cgctx, builder, sig, args):
        return builder.inttoptr(args[0], cgutils.voidptr_t)

    return sig, codegen

@nb.njit
def ADX_jit(start, end, high_addr, low_addr, close_addr, window, beg, nbe, out_addr):
    high = address_as_void_pointer(high_addr)
    low = address_as_void_pointer(low_addr)
    close = address_as_void_pointer(close_addr)
    out = address_as_void_pointer(out_addr)
    return adx_fn_type(start, end, high, low, close, window, beg, nbe, out)

def ADX_cython(x):
    for _ in range(x):
        np_out[:] = ta.ADX(arr, arr, arr, timeperiod=window)
        
def ADX_ctypes(x):
    for _ in range(x):
        adx_fn_type(c_start, c_end, high, low, close, c_window, beg, nbe, out)

@nb.njit
def ADX_nb(x, start, end,np_high, np_low, np_close, window, np_out):
    for _ in range(x):
        ADX_jit(
            start,
            end,
            np_high.ctypes.data,
            np_low.ctypes.data,
            np_close.ctypes.data,
            window,
            np.full(1, 0).ctypes.data,
            np.full(1, 0).ctypes.data,
            np_out.ctypes.data,
        )
def ADX_numba(x):
    ADX_nb(x, start, end, np_high, np_low, np_close, window, np_out)

# x = 1
# print(np_out)
# ADX_cython(x)
# print(np_out)
# np_out[:] = np.nan
# ADX_jit(start, end, np_high.ctypes.data, np_low.ctypes.data, np_close.ctypes.data, window, np.full(1, 0).ctypes.data, np.full(1, 0).ctypes.data, np_out.ctypes.data,)
# print(np_out)

x = 100000
%timeit ADX_numba(x)
%timeit ADX_ctypes(x)
%timeit ADX_cython(x)
267 ms ± 15.6 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
419 ms ± 10.8 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
1.15 s ± 24.5 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)

How to use val_price?

Hi, thank you for your work.

I see there are 3 parameters to set price: close/price/val_price. How do I use them to trade futures?

For example: value = price * 100

import numpy as np
import pandas as pd
import vectorbt as vbt

close = pd.Series([1, 2, 3, 4, 5])
val_price = close * 100
size = [1, np.nan, np.nan, np.nan, np.nan]
portfolio = vbt.Portfolio.from_orders(close, size,
                                      val_price=val_price, close_first=True)
#
print(portfolio.positions().records)

#
print(portfolio.value())

the output is:

   id  col  size  entry_idx  ...  pnl  return  direction  status
0   0    0   1.0          0  ...  4.0     4.0          0       0

[1 rows x 13 columns]
0    100.0
1    101.0
2    102.0
3    103.0
4    104.0
dtype: float64

I think the pnl should be 400.0
the value should be 100,200,300,400

Unexpected result from vbt.Portfolio.from_signals

From In [16] in /tests/notebooks/portfolio.ipynb

portfolio = vbt.Portfolio.from_signals(
    price, 
    entries=[True, False, True, True, True],
    exits=[False, True, False, False, True],
    size=[1, 2, 3, 4, 5], 
    size_type=SizeType.Cash, accumulate=False)
print(portfolio.cash)

the output is

2020-01-01     99.0
2020-01-02    101.0
2020-01-03     98.0
2020-01-04     98.0
2020-01-05      5.0
dtype: float64

Notice that the last row (cash in hand) is 5. It should ideally be 98, since both entry and exit signals are True.

Additionally, when both signals are true, in signals_order_func_nb

if size_type == SizeType.Shares:
            order_size = abs(size) - run_shares

will also give an unexpected result when accumulate is True and there have been multiple buys before this condition.

I think it makes the most sense to do nothing when both signals are True.

`'frozenset' object has no attribute 'add'` when importing vectorbt

I'm using vectorbt 0.15.3 with python 3.6.8.

When I run import vectorbt, I get the following error:

>>> import vectorbt
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/my_name/.conda/envs/my_env_name/lib/python3.6/site-packages/vectorbt/__init__.py", line 519, in <module>
    import vectorbt.signals.accessors
  File "/Users/my_name/.conda/envs/my_env_name/lib/python3.6/site-packages/vectorbt/signals/accessors.py", line 873, in <module>
    class Signals_SRAccessor(Signals_Accessor, Generic_SRAccessor):
  File "/Users/my_name/.conda/envs/my_env_name/lib/python3.6/site-packages/pandas/core/accessor.py", line 190, in decorator
    cls._accessors.add(name)
AttributeError: 'frozenset' object has no attribute 'add'

Has anyone encountered the same problem?

How can I fix it?

My pandas version is 0.24.2.
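For context (my reading of the traceback, not confirmed by the maintainers): pandas is trying to .add() the accessor name to cls._accessors, and in old pandas releases that attribute was an immutable frozenset, which reproduces exactly this error. Upgrading pandas beyond 0.24 is therefore the likely fix.

```python
# minimal reproduction of the error class (independent of pandas/vectorbt)
registry = frozenset()
try:
    registry.add('vbt')  # frozensets are immutable, so .add() does not exist
except AttributeError as e:
    print(e)  # 'frozenset' object has no attribute 'add'
```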

Why the first order is not executed with SizeType.TargetValue

When using SizeType.TargetValue, the first order is skipped. How come? Isn't shares_now the total holdings of the oc.col asset at time oc.i?

from vectorbt.portfolio.enums import OrderContext, SizeType, NoOrder
from vectorbt.portfolio.nb import create_order_nb
import vectorbt as vbt
import numba as nb
import numpy as np

@nb.njit
def order_func_nb_target_value(oc: OrderContext):
    if oc.i > 0:
        assert oc.shares_now > 0
    if oc.shares_now == 0:
        return create_order_nb(
            size=10,
            size_type=SizeType.TargetValue,
            price=oc.close[oc.i, oc.col]
        )
    else:
        return NoOrder
    
@nb.njit
def order_func_nb_target_shares(oc: OrderContext):
    if oc.i > 0:
        assert oc.shares_now > 0
    if oc.shares_now == 0:
        return create_order_nb(
            size=10,
            price=oc.close[oc.i, oc.col]
        )
    else:
        return NoOrder

close = np.random.randint(10, 20, (10, 1))
pf = vbt.Portfolio.from_order_func(
    close,
    order_func_nb_target_value,
    init_cash=100,
    cash_sharing=True,
    group_by=True,
)
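For what it's worth, with SizeType.TargetValue the size argument is interpreted as a target position value in cash, not a share count, so the shares actually ordered depend on the valuation price. A simplified sketch of that conversion (ignoring fees and vectorbt's internal valuation logic, which may also skip the order when the valuation price is NaN on the first bar):

```python
# TargetValue: order enough shares to bring the position's value to `size`
target_value = 10.0  # size=10 above means a $10 position, not 10 shares
price = 15.0         # hypothetical close price
shares_now = 0.0     # nothing held yet

order_shares = target_value / price - shares_now  # ~0.67 shares, not 10
```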

pandas register_accessor error

Hi, looking forward to trying this package out. Installed via pip in a fresh conda environment but on import I get the following error.

Traceback (most recent call last):
  File "/home/ltorres/projects/alpaca-autotrader/vectorbt_example.py", line 1, in <module>
    import vectorbt as vbt
  File "/home/ltorres/anaconda3/envs/alpaca/lib/python3.6/site-packages/vectorbt/__init__.py", line 471, in <module>
    import vectorbt.root_accessors
  File "/home/ltorres/anaconda3/envs/alpaca/lib/python3.6/site-packages/vectorbt/root_accessors.py", line 30, in <module>
    from pandas.core.accessor import _register_accessor, DirNamesMixin
ImportError: cannot import name '_register_accessor'

Tried downgrading pandas, but didn't help. Any idea why I'm getting this?

Thanks in advance.

nanmax_nb(a) does the same as np.nanmax(a, axis=0) but slower

ndarr = np.random.randint(0, 10000, size=(1000, 1000))
nd1 = vbt.nb.nanmax_nb(ndarr)
nd2 = np.nanmax(ndarr, axis=0)
%timeit vbt.nb.nanmax_nb(ndarr)
%timeit np.nanmax(ndarr, axis=0)
np.array_equal(nd1, nd2)

Output:
809 µs ± 3.07 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
651 µs ± 1.61 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
True

Probably same for nanmean_nb etc.

Feature request: Margin trades

I'm testing a strategy on a single stock where I buy more shares or sell some of them. I've set init_capital to 1000000 so that all buy orders are recorded. In this scenario the stats are not very helpful, because most of the funds are never actually deployed.

It would be great to be able to specify a margin limit (or simply assume infinity) and an interest rate.

Returns_Accessor

Is there a generic way to create new methods for these performance metrics, similar to how indicators have an indicator factory? It would be nice to have a generic way to implement new performance metrics and add them to the portfolio class.

I've only started working through the examples, so I may not have dug deep enough to find this yet.
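Short of a factory, one generic way to attach new performance metrics to any returns Series is a plain pandas accessor. This uses ordinary pandas machinery rather than vectorbt's API; the accessor and metric names below are made up:

```python
import pandas as pd

@pd.api.extensions.register_series_accessor("my_metrics")
class MyMetrics:
    """Attach custom performance metrics to any returns Series."""
    def __init__(self, obj):
        self._obj = obj

    def gain_to_pain(self):
        # sum of returns divided by sum of absolute losses
        rets = self._obj
        return rets.sum() / rets.clip(upper=0).abs().sum()

rets = pd.Series([0.02, -0.01, 0.03, -0.02])
gtp = rets.my_metrics.gain_to_pain()
```

Any returns Series (including those produced by a portfolio) then exposes the new metric under `.my_metrics`.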

IndexError: boolean index did not match indexed array

With some datasets I often get this error, and I can't understand why.

portfolio = vbt.Portfolio.from_signals(price, data["entries"], data["exits"], freq='D') 
print("Total return: ", portfolio.total_return())
print(portfolio.stats())
portfolio.plot().show()

The error:

File "vectorbt_.py", line 84, in <module>
    portfolio.plot().show()
...
 x=self_col.wrapper.index[exit_idx[closed_profit_mask]],
IndexError: boolean index did not match indexed array along dimension 0; dimension is 357 but corresponding boolean dimension is 360
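A mismatch like 357 vs 360 usually means the price and the signal arrays were built from indexes of different lengths. A defensive alignment step before simulating, sketched in plain pandas with stand-in data:

```python
import pandas as pd

# Stand-ins for the real price and signal data
price = pd.Series(range(360), index=pd.RangeIndex(360))
entries = pd.Series(False, index=pd.RangeIndex(357))

# Align everything to a common index before simulating so every array
# broadcasts to the same length
common = price.index.intersection(entries.index)
price_aligned = price.loc[common]
entries_aligned = entries.reindex(common, fill_value=False)
```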

returns_stats throw TypeError

Hi,

import numpy as np
import pandas as pd
import vectorbt as vbt

close = pd.Series([1, 2, 3, 4, 5])
val_price = close
size = [1, np.nan, np.nan, np.nan, -1]
portfolio = vbt.Portfolio.from_orders(close, size,
                                      val_price=val_price,
                                      close_first=True,
                                      freq='1D')

#
print(portfolio.stats())
#
print(portfolio.returns_stats())
D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\vectorbt\utils\datetime.py:23: RuntimeWarning: invalid value encountered in multiply
  return obj * freq_delta(freq)
Traceback (most recent call last):
  File "D:/test_vectorbt/demo_from_orders_3.py", line 16, in <module>
    print(portfolio.returns_stats())
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\vectorbt\portfolio\base.py", line 1980, in returns_stats
    stats_obj = returns.vbt.returns.stats(**kwargs)
  File "D:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\vectorbt\returns\accessors.py", line 343, in stats
    'Duration': self.wrapper.shape[0] * self.wrapper.freq,
TypeError: unsupported operand type(s) for *: 'int' and 'NoneType'

portfolio.stats() reports a Duration of 5 days 00:00:00,
but portfolio.returns_stats() cannot compute it.
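The failing expression is `self.wrapper.shape[0] * self.wrapper.freq`, which works when freq is a Timedelta but raises when the returns accessor ends up with freq=None. The mechanism in isolation:

```python
import pandas as pd

freq = pd.Timedelta("1D")
duration = 5 * freq  # what stats() computes: Timedelta('5 days')

# With freq=None the same multiplication raises, matching the traceback
try:
    _ = 5 * None
    failed = False
except TypeError:
    failed = True
```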

ADVSTEX with limit entry orders?

Great package, been trying it for about a month and I've found it to be very flexible!

I've been using ADVSTEX and there doesn't seem to be an obvious way to define limit entry orders and, subsequently, filled entries. The class seems to assume that entries are already-filled orders or market buys.

The entries I have built for ADVSTEX are signal based entries, but I would like to move towards the above direction or being able to reflect limit order entries that are filled or canceled (time-based).

Briefly, I think I can rudimentarily get around this by running the entries through a custom indicator that reflects only filled entry orders, but that may not easily account for orders that are never filled. It seems this package could benefit from an integrated function within ADVSTEX itself. I would like to hear your thoughts on this.
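The rudimentary workaround described above can be sketched as a post-processing step: a limit buy placed on a signal bar counts as filled only if a later low trades at or below the limit price within a time-to-live, and is treated as cancelled otherwise. Everything below is a hypothetical helper, not ADVSTEX API:

```python
import pandas as pd

def filled_limit_entries(signals, low, limit_price, ttl=3):
    """Mark the bar where each signalled limit order would fill, if any."""
    filled = pd.Series(False, index=signals.index)
    for pos in signals[signals].index:
        # look at the next `ttl` bars after the signal bar
        window = low.loc[pos:].iloc[1:ttl + 1]
        hit = window[window <= limit_price]
        if len(hit):
            filled.loc[hit.index[0]] = True  # fills at the first touch
    return filled

low = pd.Series([10.0, 9.5, 9.8, 9.2, 9.9])
signals = pd.Series([True, False, False, False, False])
filled = filled_limit_entries(signals, low, limit_price=9.4)
```

The resulting `filled` mask can then be passed to the exit generator in place of the raw signals; orders never touched within `ttl` bars simply produce no entry.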

multi-asset

Hi,

Thanks for sharing this library! It looks promising. Is it possible to backtest multi-asset strategies? If not, is it something you are considering?

Best

portfolio.iloc crash if num_tests is too big

Hi,

I tested PortfolioOptimization.ipynb and it always crashes at print(rb_portfolio.iloc[rb_best_asset_group].stats()),
so I cloned it into PyCharm and got the error Process finished with exit code -1073741819 (0xC0000005)

and it crashes in portfolio.base._indexing_func

new_order_records = self._orders._col_idxs_records(col_idxs)

https://github.com/polakowo/vectorbt/blob/master/vectorbt/portfolio/base.py#L435

new_records_arr = nb.record_col_map_select_nb(
                self.values, self.col_mapper.col_map, to_1d(col_idxs))

https://github.com/polakowo/vectorbt/blob/master/vectorbt/records/base.py#L299

If it doesn't crash, increase num_tests and test again.

Do you have any advice on how to avoid this?

import os
import numpy as np
import pandas as pd
import yfinance as yf
from datetime import datetime
# os.environ['NUMBA_DISABLE_JIT'] = '1'  # uncomment this if you want to use pypfopt within simulation
from numba import njit, jit

import vectorbt as vbt
from vectorbt.generic.nb import nanmean_nb
from vectorbt.portfolio.nb import create_order_nb, auto_call_seq_ctx_nb
from vectorbt.portfolio.enums import SizeType, Direction

# Define params
assets = ['FB', 'AMZN', 'NFLX', 'GOOG', 'AAPL']
start_date = datetime(2017, 1, 1)
end_date = datetime.now()
num_tests = 2000

vbt.settings.returns['year_freq'] = '252 days'

# # Download data
# asset_price = pd.DataFrame({
#     s: yf.Ticker(s).history(start=start_date, end=end_date)['Close']
#     for s in assets
# }, columns=pd.Index(assets, name='asset'))

# print(asset_price.shape)

# asset_price.to_pickle('asset_price.pkl')
asset_price = pd.read_pickle('asset_price.pkl')

np.random.seed(42)

# Generate random weights, n times
weights = []
for i in range(num_tests):
    w = np.random.random_sample(len(assets))
    w = w / np.sum(w)
    weights.append(w)

print(len(weights))

_asset_price = asset_price.vbt.tile(num_tests, keys=pd.Index(np.arange(num_tests), name='asset_group'))
_asset_price = _asset_price.vbt.stack_index(pd.Index(np.concatenate(weights), name='weights'))

print(_asset_price.columns)

# Select the first index of each month
rb_mask = ~_asset_price.index.to_period('m').duplicated()

print(rb_mask.sum())

rb_size = np.full_like(_asset_price, np.nan)
rb_size[rb_mask, :] = np.concatenate(weights)  # allocate at mask

print(rb_size.shape)

rb_portfolio = vbt.Portfolio.from_orders(
    close=_asset_price,
    size=rb_size,
    size_type='targetpercent',
    group_by='asset_group',
    cash_sharing=True,
    call_seq='auto',  # important: sell before buy
    freq='D',
    incl_unrealized=True
)

print(len(rb_portfolio.orders()))
rb_portfolio.iloc[0]
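Exit code 0xC0000005 is an access violation, which in this pipeline is typically triggered by exhausting memory with 2000 column groups at once. One way to sidestep it (an assumption, not a confirmed fix) is to simulate the random portfolios in chunks and keep only the stats from each chunk:

```python
def chunked(seq, size):
    """Yield successive slices of `seq` with at most `size` elements."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# e.g. simulate 2000 weight vectors 250 at a time, extract stats, and let
# each Portfolio object be garbage-collected before the next chunk
chunks = list(chunked(list(range(2000)), 250))
```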

talib run_combs seems to return incorrect data when speed_up is set to True

Hi,
Observing (with latest version and the one before) that doing something like this:

        SMA = vbt.IndicatorFactory.from_talib('SMA')
        fast_ma, slow_ma = SMA.run_combs(
            ohlcv_wbuf['Open'],
            timeperiod=np.arange(2, 99 + 1, step=1),
            r=2,
            speed_up=True
        )

        fast_ma = fast_ma[self.wobuf_mask]
        slow_ma = slow_ma[self.wobuf_mask]

        dmac_entries = fast_ma.real_above(slow_ma, crossed=True)
        dmac_exits = fast_ma.real_below(slow_ma, crossed=True)
        dmac_portfolio = vbt.Portfolio.from_signals(ohlcv['Open'], dmac_entries, dmac_exits, freq='1M')
        metric = 'total_return'
        dmac_perf = getattr(dmac_portfolio, metric)
        dmac_perf_matrix = dmac_perf.vbt.unstack_to_df(symmetric=True,
                                                       index_levels='fast_ma_timeperiod', column_levels='slow_ma_timeperiod')

        dmac_perf_matrix = dmac_perf_matrix.vbt.heatmap(
            xaxis_title='Slow window',
            yaxis_title='Fast window',
            width=600, height=450)

Results in a heatmap like this: https://ibb.co/qRzyFjj

Using the MA.run_combs method or using speed_up=False in the one above results in the correct one: https://ibb.co/PQsF3gB

TypeError: No matching definition for argument type(s) array(float64, 2d, C), array(int32, 1d, C), array(bool, 1d, C)

I am trying to reproduce the example from the README. I get an error on this line:

# Generate signals
fast_ma, slow_ma = vbt.MA.from_combinations(price, windows, 2)

The error message is:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-15-9abe364754ce> in <module>
      1 # Generate signals
----> 2 fast_ma, slow_ma = vbt.MA.from_combinations(price, windows, 2)

C:\ProgramData\Anaconda3\lib\site-packages\vectorbt\indicators\indicators.py in from_combinations(cls, ts, windows, r, ewm, names, **kwargs)
    202             names = ['ma' + str(i+1) for i in range(r)]
    203         windows, ewm = reshape_fns.broadcast(windows, ewm, writeable=True)
--> 204         cache_dict = cls.from_params(ts, windows, ewm=ewm, return_cache=True, **kwargs)
    205         param_lists = zip(*itertools.combinations(zip(windows, ewm), r))
    206         mas = []

C:\ProgramData\Anaconda3\lib\site-packages\vectorbt\indicators\indicators.py in from_params(cls, ts, window, ewm, **kwargs)
     98             ```
     99         """
--> 100         return super().from_params(ts, window, ewm, **kwargs)
    101 
    102     @classmethod

C:\ProgramData\Anaconda3\lib\site-packages\vectorbt\indicators\factory.py in from_params(cls, name, return_raw, *args, **kwargs)
    614             results = from_params_pipeline(
    615                 ts_list, param_list, level_names, len(output_names),
--> 616                 custom_func, *new_args, pass_lists=pass_lists, return_raw=return_raw, **kwargs)
    617             if return_raw or kwargs.get('return_cache', False):
    618                 return results

C:\ProgramData\Anaconda3\lib\site-packages\vectorbt\indicators\factory.py in from_params_pipeline(ts_list, param_list, level_names, num_outputs, custom_func, pass_lists, param_product, broadcast_kwargs, return_raw, *args, **kwargs)
    405     # Perform main calculation
    406     if pass_lists:
--> 407         output_list = custom_func(ts_list, param_list, *args, **kwargs)
    408     else:
    409         output_list = custom_func(*ts_list, *param_list, *args, **kwargs)

C:\ProgramData\Anaconda3\lib\site-packages\vectorbt\indicators\factory.py in custom_func(ts_list, param_list, return_cache, cache, *args)
    778                 # Caching
    779                 if cache is None and caching_func is not None:
--> 780                     cache = caching_func(*typed_ts_list, *param_list, *args)
    781                 if return_cache:
    782                     return cache

~\AppData\Roaming\Python\Python37\site-packages\numba\dispatcher.py in _explain_matching_error(self, *args, **kws)
    572         msg = ("No matching definition for argument type(s) %s"
    573                % ', '.join(map(str, args)))
--> 574         raise TypeError(msg)
    575 
    576     def _search_new_conversions(self, *args, **kws):

TypeError: No matching definition for argument type(s) array(float64, 2d, C), array(int32, 1d, C), array(bool, 1d, C)

I haven't changed anything in the code.
Looks like a numba error?
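It does look like a numba dispatch error: note the `array(int32, 1d, C)` in the message. On Windows, np.arange defaults to int32, while the cached numba signature expects int64. Casting the windows explicitly is a plausible workaround (an assumption based on the error message, not a confirmed fix):

```python
import numpy as np

windows = np.arange(2, 101)           # int32 on Windows, int64 on Linux/macOS
windows64 = windows.astype(np.int64)  # match the dtype numba was compiled for

# then: fast_ma, slow_ma = vbt.MA.from_combinations(price, windows64, 2)
```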

stop_type_readable IndexError: tuple index out of range

Hi,

Thanks for your work.

import vectorbt as vbt
import numpy as np
import pandas as pd
import yfinance as yf

prices = yf.Ticker("BTC-USD").history(period="max")

stops = np.arange(0, 1, 0.1)  # in %

entries = pd.DataFrame.vbt.signals.empty(prices['Open'].shape)
entries.iloc[0] = True

sl_advstex = vbt.ADVSTEX.run(
    entries,
    prices['Open'],
    prices['High'],
    prices['Low'],
    prices['Close'],
    sl_stop=list(stops),
    stop_type=None  # not needed, do not fill
)

# error 
print(sl_advstex.stop_type)
# IndexError: tuple index out of range
print(sl_advstex.stop_type_readable)

advstex_sl_stop 0.5 0.6 0.7 0.8 0.9
Date
2014-09-17 1081770405 -1610612736 1081649860 536870912 1081704796
2014-09-18 1081818116 0 1081766527 536870912 1081718493
2014-09-19 1081577177 -2147483648 1081573736 0 1081620181
2014-09-20 1081507397 -536870912 1081380519 1610612736 1081345593
2014-09-21 1081478135 0 1081527038 -1073741824 1081513418
... ... ... ... ... ...
2020-11-15 6 -609860213 7 1507580526 8
2020-11-16 7 -1315945876 5 706684395 7
2020-11-17 9 1183136307 9 665215337 8
2020-11-18 5 -292396138 6 -183980024 7

d:\Users\Kan\miniconda3\envs\py38_vectorbt\lib\site-packages\vectorbt\indicators\factory.py in <lambda>(x)
   1180         if _self.wrapper.ndim == 1:
   1181             return getattr(_self, attr_name).map(lambda x: '' if x == -1 else enum._fields[x])
-> 1182         return getattr(_self, attr_name).applymap(lambda x: '' if x == -1 else enum._fields[x])
   1183
   1184         attr_readable.__qualname__ = f'{CustomIndicator.__name__}.{attr_name}_readable'

IndexError: tuple index out of range
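The readable mapping indexes a namedtuple's `_fields` by the stored stop-type code. With `stop_type=None` the output array is never filled, so it contains arbitrary integers, and any value outside the enum's range reproduces the IndexError. The mechanism with a stand-in enum (field names hypothetical):

```python
from collections import namedtuple

# Stand-in for the stop-type enum (field names hypothetical)
StopTypeT = namedtuple("StopTypeT", ["StopLoss", "TrailStop"])
StopType = StopTypeT(0, 1)

ok = StopType._fields[1]  # valid codes map to names

# A garbage code from the unfilled output array blows past the tuple bounds
try:
    StopType._fields[1081770405]
    raised = False
except IndexError:
    raised = True
```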

auto_call_seq_ctx_nb purpose

Hi! Thank you so much for this awesome package.

I was looking at the following example in portfolio.base's doc (pasted below) and was wondering what was the use of the call to auto_call_seq_ctx_nb. Could a nice soul shed light on this? Thank you so much!

>>> from vectorbt.portfolio.nb import auto_call_seq_ctx_nb
>>> from vectorbt.portfolio.enums import SizeType, Direction

>>> @njit
... def group_prep_func_nb(gc):
...     '''Define empty arrays for each group.'''
...     size = np.empty(gc.group_len, dtype=np.float_)
...     size_type = np.empty(gc.group_len, dtype=np.int_)
...     direction = np.empty(gc.group_len, dtype=np.int_)
...     temp_float_arr = np.empty(gc.group_len, dtype=np.float_)
...     return size, size_type, direction, temp_float_arr

>>> @njit
... def segment_prep_func_nb(sc, size, size_type, direction, temp_float_arr):
...     '''Perform rebalancing at each segment.'''
...     for k in range(sc.group_len):
...         col = sc.from_col + k
...         size[k] = 1 / sc.group_len
...         size_type[k] = SizeType.TargetPercent
...         direction[k] = Direction.LongOnly
...         sc.last_val_price[col] = sc.close[sc.i, col]
...     auto_call_seq_ctx_nb(sc, size, size_type, direction, temp_float_arr)
...     return size, size_type, direction

>>> @njit
... def order_func_nb(oc, size, size_type, direction, fees, fixed_fees, slippage):
...     '''Place an order.'''
...     col_i = oc.call_seq_now[oc.call_idx]
...     return create_order_nb(
...         size=size[col_i],
...         size_type=size_type[col_i],
...         price=oc.close[oc.i, oc.col],
...         fees=fees, fixed_fees=fixed_fees, slippage=slippage,
...         direction=direction[col_i]
...     )

>>> np.random.seed(42)
>>> close = np.random.uniform(1, 10, size=(5, 3))
>>> fees = 0.001
>>> fixed_fees = 1.
>>> slippage = 0.001

>>> portfolio = vbt.Portfolio.from_order_func(
...     close,  # acts both as reference and order price here
...     order_func_nb, fees, fixed_fees, slippage,  # order_args as *args
...     active_mask=2,  # rebalance every second tick
...     group_prep_func_nb=group_prep_func_nb,
...     segment_prep_func_nb=segment_prep_func_nb,
...     cash_sharing=True, group_by=True,  # one group with cash sharing
... )

>>> portfolio.holding_value(group_by=False).vbt.scatter()
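In short, `auto_call_seq_ctx_nb` reorders the columns within the segment so that sell orders execute before buy orders, which matters under cash sharing: sale proceeds become available to fund the purchases in the same segment. The idea reduces to sorting columns by how far they sit above their target weight; a simplified sketch, not the actual implementation:

```python
import numpy as np

target_pct = np.array([0.5, 0.3, 0.2])   # desired weights after rebalancing
current_pct = np.array([0.2, 0.5, 0.3])  # weights currently held

delta = target_pct - current_pct          # negative => column must sell
call_seq = np.argsort(delta)              # biggest sells first, buys last
```

Without this reordering, a buy order early in the default sequence could fail for lack of cash even though a later sell would have freed enough.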

Numba Error on Import

Hi, thanks for this awesome library! I tried running the example in a python notebook (Python 3.8.1) and got the following error on import. Any ideas?

TypingError: Failed in nopython mode pipeline (step: nopython frontend)
Invalid use of Function(<function as_strided at 0x09D03A48>) with argument(s) of type(s): (array(float64, 1d, A), shape=UniTuple(int64 x 2), strides=UniTuple(int32 x 2))
 * parameterized
In definition 0:
    LoweringError: Failed in nopython mode pipeline (step: nopython mode backend)
Can only insert i32 at [0] in [2 x i32]: got i64

File "..\..\..\kl\lib\site-packages\numba\targets\arrayobj.py", line 5040:
    def as_strided_impl(x, shape=None, strides=None):
        x = reshape_unchecked(x, get_shape(x, shape), get_strides(x, strides))
        ^

[1] During: lowering "$22call_function.10 = call $2load_global.0(x, $12call_function.5, $20call_function.9, func=$2load_global.0, args=[Var(x, arrayobj.py:5040), Var($12call_function.5, arrayobj.py:5040), Var($20call_function.9, arrayobj.py:5040)], kws=(), vararg=None)" at c:\users\kliao\kl\lib\site-packages\numba\targets\arrayobj.py (5040)
    raised from c:\users\kliao\kl\lib\site-packages\numba\six.py:669
In definition 1:
    LoweringError: Failed in nopython mode pipeline (step: nopython mode backend)
Can only insert i32 at [0] in [2 x i32]: got i64

File "..\..\..\kl\lib\site-packages\numba\targets\arrayobj.py", line 5040:
    def as_strided_impl(x, shape=None, strides=None):
        x = reshape_unchecked(x, get_shape(x, shape), get_strides(x, strides))
        ^

[1] During: lowering "$22call_function.10 = call $2load_global.0(x, $12call_function.5, $20call_function.9, func=$2load_global.0, args=[Var(x, arrayobj.py:5040), Var($12call_function.5, arrayobj.py:5040), Var($20call_function.9, arrayobj.py:5040)], kws=(), vararg=None)" at c:\users\kliao\kl\lib\site-packages\numba\targets\arrayobj.py (5040)
    raised from c:\users\kliao\kl\lib\site-packages\numba\six.py:669
This error is usually caused by passing an argument of a type that is unsupported by the named function.
[1] During: resolving callee type: Function(<function as_strided at 0x09D03A48>)
[2] During: typing of call at c:\users\kliao\kl\lib\site-packages\vectorbt\timeseries.py (119)


File "..\..\..\kl\lib\site-packages\vectorbt\timeseries.py", line 119:
def _rolling_window_1d_nb(a, window):
    <source elided>
    strides = a.strides + (a.strides[-1],)
    return np.lib.stride_tricks.as_strided(a, shape=shape, strides=strides)
    ^
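The `i32`/`i64` mismatch in the lowering error, together with the 32-bit-sized function address `0x09D03A48`, suggests a 32-bit Python interpreter (an assumption; the strided-window code appears to expect 64-bit indices). Checking the interpreter's bitness is a quick first step:

```python
import struct

# Pointer size in bits: 32 on a 32-bit interpreter, 64 on a 64-bit one
bits = struct.calcsize("P") * 8
```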

Simulate case where you're using pre/after market data but only trading during market hours

Hi! I am exploring the following case and would just like to check with you whether I am missing something obvious.

  1. I need to simulate trading activities when the market is open and NOT during the pre/after hour periods. (I am using minutely data so it’s between 9:30 and 4 pm EST)
  2. I still want to use the data of pre/after hours in my entry conditions.
    a. So: https://nbviewer.jupyter.org/github/polakowo/vectorbt/blob/master/examples/Trading-Sessions.ipynb does not seem to apply, since I don’t want to cut that data out, I just don’t want to simulate a trade then.

So far all I can think of is:

  1. Make a Boolean series which has Falses where you’re out of your desired time range and Trues when you’re in it.
  2. Use bitwise AND with entry/exits to make it so that your entry/exit conditions don’t occur during the times you don’t want them to.

Two issues with this:

  1. If I am running a bunch of combinations, I now need to do this to every single column in the dataframe.
  2. This doesn’t work on profit and profit_or_loss algos since they’re iterative. (My workaround right now for simple algorithms which go in on an indicator but go out at profit or loss is:
    a. FOR ENTRY: use bitwise and with a boolean series which maps to the trading interval I want.
    b. FOR EXIT: pass in data without that interval for signal generation. Then reindex the exit and new entries objects to fill in the missing pieces so that entries/exits and the dataset can all be broadcast to the same shape.

Both 1 and 2 carry significant performance impacts, of course.

I am assuming that this is another case for a custom numba function but it seems like then I would have to have one for every single indicator.
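Regarding issue 1, the session mask only has to be built once per index; ANDing it with the entries DataFrame broadcasts across all combination columns at once, so it need not be applied per column. A sketch with an hourly index (times hypothetical):

```python
import pandas as pd

idx = pd.date_range("2021-01-04 09:00", periods=8, freq="60min")

# Boolean mask: True only during regular trading hours (9:30-16:00 EST,
# timezone handling omitted for brevity)
in_session = pd.Series(False, index=idx)
in_session.iloc[idx.indexer_between_time("09:30", "15:59")] = True

# `entries & in_session` then zeroes out signals outside the session for
# every parameter-combination column in one vectorized operation
```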

Example TypeError

raise TypeError(msg)
TypeError: No matching definition for argument type(s) array(float64, 2d, C), array(int32, 1d, C), array(bool, 1d, C)

import vectorbt as vbt
import numpy as np
import yfinance as yf

windows = np.arange(2, 101)

# Prepare data
ticker = yf.Ticker("BTC-USD")
price = ticker.history(period="max")['Close']

# Generate signals
fast_ma, slow_ma = vbt.MA.from_combinations(price, windows, 2)


price.head()
Out[66]: 
Date
2014-09-17   457.3300
2014-09-18   424.4400
2014-09-19   394.8000
2014-09-20   408.9000
2014-09-21   398.8200
Name: Close, dtype: float64


fast_ma, slow_ma = vbt.MA.from_combinations(price, windows, 2)
Traceback (most recent call last):
  File "C:\intelpython3\lib\site-packages\IPython\core\interactiveshell.py", line 3296, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-65-4beea761a4e4>", line 1, in <module>
    fast_ma, slow_ma = vbt.MA.from_combinations(price, windows, 2)
  File "C:\Users\Ben\Documents\python_packages\vectorbt\vectorbt\indicators\indicators.py", line 204, in from_combinations
    cache_dict = cls.from_params(ts, windows, ewm=ewm, return_cache=True, **kwargs)
  File "C:\Users\Ben\Documents\python_packages\vectorbt\vectorbt\indicators\indicators.py", line 100, in from_params
    return super().from_params(ts, window, ewm, **kwargs)
  File "C:\Users\Ben\Documents\python_packages\vectorbt\vectorbt\indicators\factory.py", line 616, in from_params
    custom_func, *new_args, pass_lists=pass_lists, return_raw=return_raw, **kwargs)
  File "C:\Users\Ben\Documents\python_packages\vectorbt\vectorbt\indicators\factory.py", line 407, in from_params_pipeline
    output_list = custom_func(ts_list, param_list, *args, **kwargs)
  File "C:\Users\Ben\Documents\python_packages\vectorbt\vectorbt\indicators\factory.py", line 780, in custom_func
    cache = caching_func(*typed_ts_list, *param_list, *args)
  File "C:\intelpython3\lib\site-packages\numba\dispatcher.py", line 574, in _explain_matching_error
    raise TypeError(msg)
TypeError: No matching definition for argument type(s) array(float64, 2d, C), array(int32, 1d, C), array(bool, 1d, C)

Windows7 64bit, python3.6
vectorbt-0.5
numpy version = 1.16.6

Possible issue with take_profit algos

Hi - please find an example file below.
What I am seeing is that when attempting to make a strategy with profit as my exit, the strategy still produces closed positions with losses.
vectorbt_profit_example.py.zip
From looking at what the signals object returns, it seems that when you have an entry followed by multiple other entries, the profit exit condition is assessed from the very end of the chain.
Example (part of the series produced by the example file): https://ibb.co/WnW43DS
So it looks like the trade is recorded as the first entry in the attached graph, but the exit is calculated from the last entry. Not sure if this is a bug or incorrect use of the library on my end.
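One way to test whether the chain behavior is the culprit is to collapse each run of consecutive entries to its first signal before computing the profit exit, so the exit is measured from the bar the position was actually opened on. A sketch in plain pandas:

```python
import pandas as pd

entries = pd.Series([True, True, False, True, True])

# Keep only entries that are not immediately preceded by another entry
first_only = entries & ~entries.shift(fill_value=False)
```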

Can't run the example from README.md on Python 3.9

I've got Python 3.9 x64 on Windows 10, and the following dependencies:

vectorbt==0.14.4
numpy==1.19.4
numba==0.51.2 #from binary
pandas==1.1.4

On running the example, I get the following output:

16:25:37 (master) Username->  python39 ./vectorbt_test.py
Traceback (most recent call last):
  File "vectorbt_test.py", line 9, in <module>
  File "C:\Users\username\AppData\Roaming\Python\Python39\site-packages\vectorbt\indicators\factory.py", line 14, in run_combs
  File "C:\Users\username\AppData\Roaming\Python\Python39\site-packages\vectorbt\indicators\factory.py", line 1578, in _run_combs
  File "C:\Users\username\AppData\Roaming\Python\Python39\site-packages\vectorbt\indicators\factory.py", line 1415, in _run
  File "C:\Users\username\AppData\Roaming\Python\Python39\site-packages\vectorbt\indicators\factory.py", line 763, in run_pipeline
  File "C:\Users\username\AppData\Roaming\Python\Python39\site-packages\vectorbt\indicators\factory.py", line 1762, in custom_func
  File "C:\Users\username\AppData\Roaming\Python\Python39\site-packages\numba\core\dispatcher.py", line 418, in _compile_for_args
  File "C:\Users\username\AppData\Roaming\Python\Python39\site-packages\numba\core\dispatcher.py", line 358, in error_rewrite
  File "C:\Users\username\AppData\Roaming\Python\Python39\site-packages\numba\core\utils.py", line 80, in reraise
numba.core.errors.UnsupportedError: Failed in nopython mode pipeline (step: analyzing bytecode)
Use of unsupported opcode (CONTAINS_OP) found

File "..\..\..\Users\username\AppData\Roaming\Python\Python39\site-packages\vectorbt\indicators\nb.py", line 38:
def ma_cache_nb(ts, windows, ewms):
    <source elided>
        h = hash((windows[i], ewms[i]))
        if h not in cache_dict:
        ^

The line it's throwing an error on looks relatively innocent (https://github.com/polakowo/vectorbt/blob/master/vectorbt/indicators/nb.py#L38), but it seems that numba's nopython mode doesn't support the in keyword here. I'm not familiar with numba, though.

Is there a version of any of the dependencies that are recommended?

Generate list of all possible combinations

Is there an easy way to generate all possible combinations of parameters without going through the pipeline via run() or run_combs()? I'm looking for a fast way to get the possible combinations as a list of columns prior to selecting specific combinations to run.
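Since run_combs ultimately enumerates parameter combinations, the same column tuples can be listed up front with itertools, without going through the pipeline. A plain-Python sketch (assuming r=2 window pairs):

```python
import itertools

windows = [2, 3, 5, 8]

# All unique (fast, slow) window pairs, the combinations run_combs(r=2)
# would iterate over
pairs = list(itertools.combinations(windows, 2))

# Independent parameters combine via a Cartesian product instead
params = list(itertools.product(windows, [True, False]))
```

The resulting tuples can be filtered down to the combinations of interest before running anything.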
