# Constrained Output Length Controller

Build an output length controller with estimation, prompt calibration, smart truncation, and expansion for precise length control.

Implement a system for controlling LLM output length precisely.

## Length Requirements
{{length_requirements}}

## Content Priorities
{{content_priorities}}

## Quality Constraints
{{quality_constraints}}

Build the controller:

```python
class OutputLengthController:
    def estimate_output_length(self, prompt: str, task_type: str) -> int:
        """Predict expected output tokens"""
        pass
    
    def calibrate_prompt(self, prompt: str, target_tokens: int) -> str:
        """
        Add length guidance:
        - Explicit token limits
        - Structural constraints
        - Detail level instructions
        """
        pass
    
    def truncate_smartly(self, output: str, max_tokens: int) -> str:
        """
        Preserve:
        - Complete sentences
        - Key information
        - Structural integrity
        """
        pass
    
    def expand_to_minimum(self, output: str, min_tokens: int, context: str) -> str:
        """Expand output to meet minimum length"""
        pass
```
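As a reference point, here is a minimal sketch of the `truncate_smartly` step, dropping trailing sentences until the text fits. It uses a whitespace split as a crude token proxy (a real implementation would use the model's tokenizer, e.g. `tiktoken` for OpenAI models), and the regex sentence split is an illustrative assumption:

```python
import re

def rough_token_count(text: str) -> int:
    # Whitespace split is a crude proxy; swap in the model's tokenizer for accuracy.
    return len(text.split())

def truncate_smartly(output: str, max_tokens: int) -> str:
    """Drop trailing sentences until the output fits within max_tokens."""
    if rough_token_count(output) <= max_tokens:
        return output
    # Split on sentence boundaries: ., !, or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", output.strip())
    kept, used = [], 0
    for sentence in sentences:
        cost = rough_token_count(sentence)
        if used + cost > max_tokens:
            break  # stop before a partial sentence would be emitted
        kept.append(sentence)
        used += cost
    # Always return at least one sentence so the result is non-empty.
    return " ".join(kept) if kept else sentences[0]
```

Truncating at sentence boundaries rather than at an exact token index trades a few tokens of precision for output that never ends mid-sentence, which is usually the right trade for user-facing text.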

Include:
- Token counting accuracy
- Length distribution analysis
- Retry strategies
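One plausible retry strategy is a generate-check-recalibrate loop: measure the output, and if it misses the target band, retry with feedback about the miss. A hedged sketch (the `generate` callable, tolerance, and attempt count are all illustrative assumptions, and the whitespace count again stands in for a real tokenizer):

```python
def generate_with_length_control(generate, prompt: str, target_tokens: int,
                                 tolerance: float = 0.2, max_attempts: int = 3) -> str:
    """Call `generate` (a prompt -> text function), retrying with stronger
    length guidance until the output lands within tolerance of the target."""
    low = int(target_tokens * (1 - tolerance))
    high = int(target_tokens * (1 + tolerance))
    calibrated = f"{prompt}\n\nRespond in roughly {target_tokens} tokens."
    best, best_gap = None, float("inf")
    for _ in range(max_attempts):
        output = generate(calibrated)
        n = len(output.split())  # crude token proxy
        if low <= n <= high:
            return output
        # Remember the attempt closest to the target as a fallback.
        gap = abs(n - target_tokens)
        if gap < best_gap:
            best, best_gap = output, gap
        # Recalibrate with explicit feedback about the miss direction.
        direction = "shorter" if n > high else "longer"
        calibrated = (f"{prompt}\n\nYour previous answer was about {n} tokens; "
                      f"the target is {target_tokens}. Make it {direction}.")
    return best  # no attempt hit the band; return the closest one
```

Returning the closest attempt rather than raising keeps the controller usable even when the model never hits the band; a stricter variant could instead hand the fallback to `truncate_smartly` or `expand_to_minimum`.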

## Details

- Category: Coding
- Use Cases: Length control, Output formatting, Token management
- Works Best With: claude-sonnet-4-20250514, gpt-4o