Sunday, April 19, 2026

6. Taken together, these components produce an LTSM model (the LTSM Bundle) that outperforms all existing methods for reprogramming LLMs for time series, as well as transformer-based time series forecasting models.

Comparison of the bundle with existing frameworks. Image by author.

Reprogram your LTSM yourself!

Want to reprogram your own LTSM? Here is a tutorial for the LTSM bundle: https://github.com/daochenzha/ltsm/blob/main/tutorial/README.md

Step 1: Create a virtual environment. Clone the repository and install the requirements.

conda create -n ltsm python=3.8.0
conda activate ltsm
git clone git@github.com:daochenzha/ltsm.git
cd ltsm
pip3 install -e .
pip3 install -r requirements.txt

Step 2: Prepare the datasets. Make sure your local data folder looks like this:

- ltsm/
    - datasets/
        DATA_1.csv
        DATA_2.csv
        DATA_3.csv
        ...
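For a quick dry run before plugging in real data, you can generate placeholder CSVs matching the layout above. The column names (`date`, `value`) here are illustrative assumptions; use whatever schema your actual datasets follow.

```python
import os

import numpy as np
import pandas as pd

# Create the expected folder layout with small synthetic series.
# NOTE: the "date"/"value" columns are assumptions for illustration only.
os.makedirs("datasets", exist_ok=True)
dates = pd.date_range("2024-01-01", periods=200, freq="h")
for name in ["DATA_1", "DATA_2", "DATA_3"]:
    df = pd.DataFrame({
        "date": dates,
        "value": np.sin(np.arange(200) / 10) + np.random.randn(200) * 0.1,
    })
    df.to_csv(f"datasets/{name}.csv", index=False)

print(sorted(os.listdir("datasets")))
```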

Step 3: Generate time series prompts from the training, validation, and test datasets.

python3 prompt_generate_split.py

Step 4: Find the generated time series prompts in the ‘./prompt_data_split’ folder. Then run the following commands to finalize the prompts.

# normalize the prompts
python3 prompt_normalization_split.py --mode fit

# export the prompts to the "./prompt_data_normalize_split" folder
python3 prompt_normalization_split.py --mode transform
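The two-pass fit/transform pattern above mirrors scikit-learn-style scalers: statistics are computed once over the training prompts, then applied to every split with those frozen statistics. A minimal sketch of the idea (the script's actual internals may differ):

```python
import numpy as np


class PromptNormalizer:
    """Two-pass normalizer: `fit` learns statistics, `transform` applies them."""

    def fit(self, prompts: np.ndarray) -> "PromptNormalizer":
        # "fit" mode: compute per-feature mean and std over the training prompts.
        self.mean_ = prompts.mean(axis=0)
        self.std_ = prompts.std(axis=0) + 1e-8  # guard against division by zero
        return self

    def transform(self, prompts: np.ndarray) -> np.ndarray:
        # "transform" mode: standardize any split with the stored statistics.
        return (prompts - self.mean_) / self.std_


train_prompts = np.array([[1.0, 10.0], [3.0, 30.0]])
normalizer = PromptNormalizer().fit(train_prompts)
print(normalizer.transform(train_prompts))
```

Separating the two modes ensures the validation and test prompts are normalized with training-set statistics only, avoiding information leakage.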

Final step: Train your own LTSM using Time Series Prompt and Linear Tokenization with gpt2-medium.

python3 main_ltsm.py \
    --model LTSM \
    --model_name_or_path gpt2-medium \
    --train_epochs 500 \
    --batch_size 10 \
    --pred_len 96 \
    --data_path "DATA_1.csv DATA_2.csv" \
    --test_data_path_list "DATA_3.csv" \
    --prompt_data_path "prompt_bank/prompt_data_normalize_split" \
    --freeze 0 \
    --learning_rate 1e-3 \
    --downsample_rate 20 \
    --output_dir [Your_Output_Path]

For more information, see our paper and GitHub repository.

paper: https://arxiv.org/pdf/2406.14045
code: https://github.com/daochenzha/ltsm/

