Great video. But what can we do when our data is much larger than in the video example? Is there any alternative way to increase the number of tokens per query?
@superlazycoder1984 · 7 months ago
Hi Lauren, maybe you can try one of the large models for Google TAPAS or Microsoft TAPEX. The large models seem to have a higher model_max_length.
@superlazycoder1984 · 7 months ago
You can also try chunking your table data and then feeding the chunks to the model.
@laurenttran8288 · 7 months ago
@@superlazycoder1984 Yeah, but might that lose some precision in the prediction?
@superlazycoder1984 · 6 months ago
Chunking shouldn't necessarily lose precision, and using a larger model shouldn't either.
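The chunking idea above can be sketched roughly like this. This is a hypothetical illustration, not code from the video: `chunk_table`, the example table, and the chunk size are all my own names and choices. The idea is to split the table row-wise so each chunk stays under the model's token limit, then query each chunk separately and aggregate the answers.

```python
# Hypothetical sketch of row-wise table chunking for table-QA models
# (e.g. TAPAS/TAPEX), which truncate inputs past model_max_length.
import pandas as pd

def chunk_table(df: pd.DataFrame, rows_per_chunk: int) -> list[pd.DataFrame]:
    """Split a table into row-wise chunks; each chunk keeps the full header."""
    return [df.iloc[i:i + rows_per_chunk].reset_index(drop=True)
            for i in range(0, len(df), rows_per_chunk)]

# Example: a 10-row table split into chunks of at most 4 rows.
table = pd.DataFrame({"city": [f"c{i}" for i in range(10)],
                      "population": list(range(10))})
chunks = chunk_table(table, rows_per_chunk=4)
print([len(c) for c in chunks])  # → [4, 4, 2]

# Each chunk could then be passed to a table-question-answering
# pipeline separately, and the per-chunk answers compared or merged.
```

One caveat, matching the precision concern above: questions whose answer spans rows in different chunks (e.g. a sum over the whole table) can't be answered from any single chunk, so aggregation logic matters.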