
Input Length Limits

  • CodeVista displays the input length of your prompt in real time, along with the maximum input length allowed for the chosen AI model.

  • The input length is measured in tokens: sub-word units that correspond roughly to words or characters, depending on the language (see the token-counting sketch after this list).

  • Here are the maximum input length limits for different models:

    Model                              | Max Input Length (Tokens)
    GPT-3.5 4K                         | 1500
    GPT-3.5 16K, LLaMA 2               | 7500
    GPT-4, Google Codey, Google Gemini | 15500
  • If your prompt exceeds the input length limit, CodeVista truncates the oldest messages from the context to make room for the new input (see the truncation sketch after this list).

  • You can learn more about tokens and their calculation in this article from OpenAI.
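
To estimate how many tokens a prompt will consume before you send it, you can count them locally. The sketch below uses OpenAI's tiktoken library as an assumption; CodeVista does not document which tokenizer it uses internally, so counts may differ slightly from what the editor displays.

```python
# A minimal sketch of counting tokens with tiktoken (an assumption; CodeVista's
# internal tokenizer is not documented, so counts may differ slightly).
import tiktoken

def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
    """Return the approximate token count for `text` under `model`'s encoding."""
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        # Fall back to a general-purpose encoding for unknown model names.
        encoding = tiktoken.get_encoding("cl100k_base")
    return len(encoding.encode(text))

print(count_tokens("Explain how this function handles edge cases."))
```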

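The following is a simplified illustration of oldest-first truncation, not CodeVista's actual implementation: messages are dropped from the front of the conversation history until the total token count fits under the model's limit. The limits mirror the table above; the dictionary keys and function names are hypothetical.

```python
import tiktoken

# Hypothetical limits mirroring the table above; keys are illustrative only.
MODEL_LIMITS = {
    "gpt-3.5-4k": 1500,
    "gpt-3.5-16k": 7500,
    "gpt-4": 15500,
}

_ENCODING = tiktoken.get_encoding("cl100k_base")

def fit_to_limit(messages: list[str], model_key: str) -> list[str]:
    """Drop the oldest messages until the combined prompt fits the model's limit."""
    limit = MODEL_LIMITS[model_key]
    trimmed = list(messages)
    while trimmed and sum(len(_ENCODING.encode(m)) for m in trimmed) > limit:
        trimmed.pop(0)  # discard the oldest message first
    return trimmed

history = ["old question...", "old answer...", "newest question..."]
print(fit_to_limit(history, "gpt-4"))
```
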
By understanding and effectively using the code context, combining predefined actions with context, and being mindful of input length limits, you can significantly enhance the quality and accuracy of the assistance provided by CodeVista.