@DolphinDB


In #DolphinDB we can import historical data into a stream table in chronological order as “real-time data” so that the same script can be used both for backtesting and real-time trading.

For more info, please check: https://medium.com/@DolphinDB_Inc/introduction-to-backtesting-strategy-historical-data-replay-in-dolphindb-497e24af596d.
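The replay idea can be sketched outside DolphinDB as well. Here is a minimal Python illustration of feeding historical rows, in timestamp order, to the same handler a live feed would call; the table and handler are hypothetical stand-ins, not DolphinDB's actual `replay` API:

```python
from datetime import datetime

# Hypothetical historical ticks, deliberately out of order on disk.
history = [
    {"time": datetime(2024, 1, 2, 9, 30, 1), "sym": "AAA", "price": 10.1},
    {"time": datetime(2024, 1, 2, 9, 30, 0), "sym": "BBB", "price": 20.0},
    {"time": datetime(2024, 1, 2, 9, 30, 2), "sym": "AAA", "price": 10.2},
]

def replay(table, handler):
    """Feed rows to `handler` in chronological order, as if they arrived live."""
    for row in sorted(table, key=lambda r: r["time"]):
        handler(row)

received = []
replay(history, received.append)
```

Because the handler sees replayed rows exactly as it would see live ticks, the same strategy code runs unchanged in backtesting and in production.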

Introduction to backtesting strategy: Historical data replay in DolphinDB

Medium

#DolphinDB offers a unified batch and stream processing solution for machine learning, enabling efficient feature generation from both historical and real-time data. By constructing 10-minute features from 3-second snapshot data, the solution generates 676 derived features and proves to be 30 times faster than Python.

For more info, please visit: https://medium.com/@DolphinDB_Inc/feature-engineering-for-stock-volatility-prediction-the-unified-stream-and-batch-processing-21bc38651b4.
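As a rough illustration of the batch side of such a pipeline, here is a pandas sketch that rolls hypothetical 3-second snapshots up into 10-minute feature bars (realized volatility, open, close). The data and feature choices are assumptions for illustration, not the article's 676-feature set:

```python
import numpy as np
import pandas as pd

# Hypothetical 3-second snapshot prices for one stock (20 minutes of data).
idx = pd.date_range("2024-01-02 09:30:00", periods=400, freq="3s")
rng = np.random.default_rng(0)
snap = pd.DataFrame({"price": 100 + np.cumsum(rng.normal(0, 0.01, 400))}, index=idx)

# Roll the 3-second snapshots up into 10-minute feature bars.
logret = np.log(snap["price"]).diff()
features = pd.DataFrame({
    "realized_vol": logret.resample("10min").apply(lambda r: np.sqrt((r ** 2).sum())),
    "open": snap["price"].resample("10min").first(),
    "close": snap["price"].resample("10min").last(),
})
```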

Feature Engineering for Stock Volatility Prediction: The Unified Stream and Batch Processing Framework in DolphinDB

This article introduces how to conduct feature engineering for model training and prediction in DolphinDB. Inspired by the 1st place solution of the time series prediction competition Kaggle Optiver…


Calculating transaction costs from tick data involves matching trade and NBBO tables, which have nanosecond-level timestamps that rarely align perfectly. #DolphinDB allows for complex, non-exact joins with just a single line of script, facilitating the calculation of transaction costs.

For more info, please visit: https://medium.com/@DolphinDB_Inc/how-to-calculate-transaction-costs-in-dolphindb-228da46ee922.
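The non-exact join in question is an asof join: each trade is matched with the most recent quote at or before its timestamp. A pandas sketch of the same matching rule on hypothetical data:

```python
import pandas as pd

# Hypothetical trade and NBBO tables; nanosecond timestamps never align exactly.
trades = pd.DataFrame({
    "time": pd.to_datetime([1_000_000_100, 2_000_000_300], unit="ns"),
    "price": [10.03, 10.06],
})
nbbo = pd.DataFrame({
    "time": pd.to_datetime([1_000_000_000, 2_000_000_000], unit="ns"),
    "bid": [10.00, 10.03],
    "ask": [10.04, 10.07],
})

# Asof join: each trade takes the latest quote at or before its timestamp.
joined = pd.merge_asof(trades, nbbo, on="time")
# Transaction cost measured against the quote midpoint.
joined["cost"] = joined["price"] - (joined["bid"] + joined["ask"]) / 2
```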

Helpful Tools for Quant丨Efficiently Calculate Transaction Costs from Tick Data

The calculation of transaction costs from tick data often involves two tables: trade and nbbo. As the timestamps of both tables are at nanosecond level, there is virtually no exact match between the…


#DolphinDB’s powerful functions and parallel computing capabilities enable quick and accurate correlation calculations, enhancing investment strategies and risk management.

For more info, please visit: https://medium.com/@DolphinDB_Inc/mastering-pairwise-correlations-calculation-of-securities-through-coding-e840c8e8260a
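For a sense of what the calculation looks like, here is a vectorized pairwise-correlation sketch on hypothetical daily returns, written in NumPy; DolphinDB's own functions are what the article covers:

```python
import numpy as np

# Hypothetical daily returns: 250 days x 3 securities.
rng = np.random.default_rng(42)
returns = rng.normal(0, 0.01, size=(250, 3))
returns[:, 1] += 0.5 * returns[:, 0]  # make securities 0 and 1 co-move

# Full pairwise correlation matrix in one vectorized call.
corr = np.corrcoef(returns, rowvar=False)
```

The matrix is symmetric with a unit diagonal, and the co-moving pair stands out against the independent one.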

Mastering Pairwise Correlations Calculation of Securities Through Coding

Calculating pairwise correlations of securities sheds light on the relationship between different securities, assisting in investment decision-making and risk management. In this article, you’re…


#DolphinDB improves data analysis by optimizing query and aggregation calculations, delivering fast, accurate processing of large datasets. Its distributed architecture and optimization techniques significantly accelerate data queries and aggregations, making it well suited to real-time data analysis across various industries.

For more info, please visit: https://medium.com/@DolphinDB_Inc/accelerating-data-analysis-embracing-the-efficient-query-and-aggregation-calculations-c1dfb484d09b.

Accelerating Data Analysis: Embracing the Efficient Query and Aggregation Calculations

Efficient query and calculation capabilities play a pivotal role in handling large-scale datasets, and ensuring timely and accurate data analysis. In this article, we will explore how DolphinDB’s…


How to convert high-frequency signals into discrete buy/sell/hold signals?

It can be easily implemented in #DolphinDB with the iif function!

For more info, please visit: https://medium.com/@DolphinDB_Inc/high-frequency-data-analysis-converting-high-frequency-signals-to-discrete-buy-sell-signals-ea3146820424.
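DolphinDB's iif is a vectorized conditional, so nested calls map a continuous signal to discrete labels in one expression. The same idea in NumPy, with hypothetical thresholds:

```python
import numpy as np

# Hypothetical high-frequency signal values.
signal = np.array([0.8, 0.1, -0.9, 0.3, -0.2])

# A DolphinDB-style nested conditional, e.g.
#   iif(signal > 0.5, 1, iif(signal < -0.5, -1, 0))
# expressed with np.where: 1 = buy, -1 = sell, 0 = hold.
position = np.where(signal > 0.5, 1, np.where(signal < -0.5, -1, 0))
```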

High Frequency Data Analysis: Converting High-frequency Signals to Discrete Buy/Sell Signals

In high-frequency trading, we generate high-frequency signals from trade and quote tick data and analyze these signals to identify trading opportunities. This tutorial demonstrates how to convert…


#DolphinDB exhibits outstanding performance in data downsampling for the following reasons:
1. Jobs are executed in a distributed manner, so the resources of different nodes can be utilized simultaneously;
2. Compression reduces disk I/O;
3. Columnar storage and vectorized computation improve the efficiency of aggregation.

For more info, please visit: https://medium.com/@DolphinDB_Inc/how-to-downsample-your-data-efficiently-ef7125fb9a6.
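The aggregation step of such a downsampling job can be sketched in pandas: hypothetical second-level ticks reduced to minute-level OHLCV bars. (The distribution, compression, and storage points above are DolphinDB-specific and not shown here.)

```python
import numpy as np
import pandas as pd

# Hypothetical second-level ticks: 10 minutes of data.
idx = pd.date_range("2024-01-02 09:30:00", periods=600, freq="1s")
ticks = pd.DataFrame(
    {"price": 100 + np.arange(600) * 0.001, "volume": np.ones(600, dtype=int)},
    index=idx,
)

# Downsample to minute-level OHLCV bars.
bars = ticks["price"].resample("1min").ohlc()
bars["volume"] = ticks["volume"].resample("1min").sum()
```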

How to Downsample Your Data Efficiently - DolphinDB - Medium

In this article, you’ll learn how to efficiently downsample 6.48 billion high-frequency records to 61 million minute-level records in only 41 seconds in DolphinDB. As the SQL query may involve…

A Simpler Way to Calculate WorldQuant 101 Alphas - DolphinDB - Medium

The formulas of 101 quantitative trading alphas used by WorldQuant were presented in the paper 101 Formulaic Alphas. However, some formulas are complex, leading to challenges in calculation. Take the…


Based on #DolphinDB's streaming replay, distributed database architecture, and APIs, you can build a powerful tool for historical insight and model backtesting, allowing you to review situations of interest and improve future performance.

For more info, please visit: https://medium.com/@DolphinDB_Inc/best-practices-for-market-data-replay-e69fea702736.

Best Practices for Market Data Replay - DolphinDB - Medium

This tutorial introduces the practical implementation of DolphinDB's replay feature. Based on DolphinDB's streaming replay, distributed database architecture and APIs, you can create a…


DolphinDB’s multi-source market data integration solution significantly enhances the cryptocurrency market data acquisition process, providing traders with a superior framework for strategic trading decisions and activities.

For more info, please visit: https://medium.com/@DolphinDB_Inc/integrating-crypto-market-data-from-multiple-sources-a92d5ef5f264.
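At its core, multi-source integration merges overlapping feeds into one gap-free stream. A toy Python sketch with two hypothetical exchange feeds, where each covers the other's gaps:

```python
# Two hypothetical feeds for the same instrument: (timestamp, price) ticks.
feed_a = [(1, 100.0), (2, 100.1), (4, 100.3)]  # missed the tick at t=3
feed_b = [(2, 100.1), (3, 100.2), (4, 100.3)]  # missed the tick at t=1

# Combine, deduplicate, and sort by timestamp: each source covers the
# other's outages, yielding one gap-free stream.
merged = sorted(set(feed_a) | set(feed_b))
```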

Integrating Crypto Market Data From Multiple Sources

Cryptocurrency trading professionals typically use Python, Java, Rust, or C++ for exchange API access. Single-source data access presents inherent limitations, such as data loss from server downtime…
