Release Notes
New updates and improvements to Morph Apply, the fastest code application system
Features
- Natural language codebase search with AI agent integration
- Native adapters for Anthropic, OpenAI, Gemini, and Vercel AI SDK
- Remote sandbox execution support (E2B, Modal, Daytona, Docker, SSH)
- Direct usage API for custom integrations
- Provider-agnostic Zod schemas for custom integrations
- Seamless integration with existing agent workflows
WarpGrep gives AI agents a sub-agent that searches and navigates codebases from natural language queries, making it easier to build powerful coding assistants.
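As a rough sketch of how an agent might hand a query to a search sub-agent like this, the snippet below assembles a chat-style request payload. The model id, field names, and system prompt are illustrative assumptions, not documented values.

```python
import json

def build_search_request(query: str, repo_path: str) -> dict:
    """Assemble a chat-style payload asking a search sub-agent to
    answer a natural-language question about a codebase.
    All field values here are illustrative, not a documented schema."""
    return {
        "model": "warpgrep-agent",  # illustrative model id
        "messages": [
            {"role": "system",
             "content": f"You search and navigate the codebase at {repo_path}."},
            {"role": "user", "content": query},
        ],
    }

payload = build_search_request(
    "Where is retry logic for HTTP 429 responses handled?", "/srv/app")
print(json.dumps(payload, indent=2))
```

The parent agent would send this payload to whichever endpoint hosts the sub-agent and read the answer out of the reply.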
Features
- Increased streaming speed to 10,500+ tokens/second when applying updates to large files
- Enhanced context window support for files up to 1M tokens
- Smart context reranking
- New batch processing API for multiple file updates
- Improved error recovery for partial applications
Bug Fixes
- Fixed edge case in merge conflict resolution
- Resolved streaming buffer issues for very large files
- Improved handling of malformed update snippets
This release focuses on speed and reliability for enterprise-scale code applications.
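To illustrate what batching multiple file updates into one request could look like, here is a minimal sketch. The field names (`files`, `path`, `original`, `update`) are assumptions for illustration, not the documented batch API schema.

```python
def build_batch_request(updates: list[tuple[str, str, str]]) -> dict:
    """Bundle several file updates into one request body.
    updates: (path, original_code, update_snippet) triples.
    Field names are illustrative, not a documented schema."""
    return {
        "files": [
            {"path": path, "original": original, "update": update}
            for path, original, update in updates
        ]
    }

req = build_batch_request([
    ("src/a.py", "def f():\n    pass\n", "def f():\n    return 1\n"),
    ("src/b.py", "x = 0\n", "x = 42\n"),
])
```

Sending one batched request instead of one request per file amortizes connection and authentication overhead across the whole update set.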
Features
- 🚀 Production-grade reliability with 99.9% uptime
- ⚡ 10,500+ tokens/second streaming performance
- 🔧 Support for all major programming languages and file formats
- 🎯 Intelligent diff detection and conflict resolution
- 📊 Real-time application metrics and monitoring
- 🔐 Enterprise security with SOC 2 compliance
- 🌐 Global edge deployment for minimal latency
Major milestone release bringing Morph Apply to production readiness.
Features
- New interactive playground with live preview
- One-click integration with Continue.dev
- Enhanced API documentation with interactive examples
- Real-time collaboration features for team workflows
- Custom model support for GPT-4, Claude, Gemini, and more
Bug Fixes
- Fixed authentication flow for team accounts
- Resolved API rate limiting edge cases
- Improved error messages for debugging
Features
- Native support for Claude 3.5 Sonnet, GPT-4o, and Gemini Pro
- Model-specific optimization for different code types
- Automatic model selection based on file type and size
- Parallel processing for multiple file updates
- Advanced caching system reducing API costs by 60%
Bug Fixes
- Fixed timeout issues for very large codebases
- Improved memory management for concurrent requests
- Resolved edge cases in TypeScript interface merging
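One way a response cache like the one above can work (an assumption about the mechanism, not a description of the actual implementation) is to key each apply request by a hash of its inputs, so that re-applying an identical update skips the model call entirely:

```python
import hashlib

_cache: dict[str, str] = {}

def cached_apply(original: str, update: str, apply_fn) -> str:
    """Serve identical (original, update) pairs from a local cache,
    calling apply_fn only on a miss. Sketch of a content-hash cache,
    not the actual caching system."""
    key = hashlib.sha256((original + "\x00" + update).encode()).hexdigest()
    if key not in _cache:
        _cache[key] = apply_fn(original, update)
    return _cache[key]

calls = []
def fake_apply(original, update):
    calls.append(1)          # count real "model calls"
    return update            # stand-in for the merged result

cached_apply("x = 0\n", "x = 42\n", fake_apply)
cached_apply("x = 0\n", "x = 42\n", fake_apply)  # served from cache
```

Only the first call reaches `fake_apply`; the repeat is answered from the cache, which is the behavior that drives API cost down for repeated applies.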
Features
- Team collaboration with shared workspaces
- Advanced usage analytics and billing controls
- Custom deployment options for enterprise customers
- Webhook support for CI/CD integration
- Priority support with dedicated engineering resources
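A common pattern for consuming webhooks like these in a CI/CD pipeline is to verify an HMAC signature over the raw request body before trusting the event. The header scheme and secret format below are assumptions for illustration; the actual webhook documentation defines the real scheme.

```python
import hashlib
import hmac

def verify_signature(secret: bytes, body: bytes, signature_hex: str) -> bool:
    """Timing-safe check that signature_hex is the HMAC-SHA256 of body.
    Illustrative sketch of a typical webhook verification scheme."""
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)

secret = b"whsec_example"                       # illustrative secret
body = b'{"event": "apply.completed"}'          # illustrative payload
sig = hmac.new(secret, body, hashlib.sha256).hexdigest()
```

`hmac.compare_digest` matters here: a plain `==` comparison can leak timing information an attacker could use to forge signatures byte by byte.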
Features
- Smart context selection reducing hallucinations by 40%
- File-type specific merge strategies
- Syntax-aware conflict resolution
- Automatic backup and rollback capabilities
- Enhanced support for configuration files and documentation
Bug Fixes
- Fixed issues with nested function edits
- Improved handling of indentation and formatting
- Resolved conflicts in import statement merging
This release significantly improves edit accuracy and reduces the need for manual cleanup.
Features
- 🎯 Core apply functionality with 1000+ tokens/second
- 📝 Support for code, documentation, and configuration files
- 🔄 Real-time streaming of applied changes
- 🛡️ Safe merge algorithms preventing data loss
- 🔗 OpenAI-compatible API for easy integration
- 📚 Comprehensive documentation and examples
- 💻 Developer-friendly playground interface
Initial public release of Morph Apply - the fastest way to apply LLM-generated code updates.
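Because the API is OpenAI-compatible, requests take the standard chat-completions shape, which any OpenAI SDK can send. The model id and the tag-based message format below are illustrative assumptions, not confirmed values; the streaming flag reflects the real-time streaming feature noted above.

```python
def build_apply_request(original_code: str, update_snippet: str) -> dict:
    """Build an OpenAI-style chat-completions body asking the model to
    apply an update snippet to the original code. The model id and the
    message formatting are illustrative, not documented values."""
    return {
        "model": "morph-apply",  # illustrative model id
        "messages": [{
            "role": "user",
            "content": (
                f"<code>{original_code}</code>\n"
                f"<update>{update_snippet}</update>"
            ),
        }],
        "stream": True,  # stream the merged file back token by token
    }

req = build_apply_request("def f():\n    pass\n", "def f():\n    return 1\n")
```

Any client that speaks the OpenAI chat-completions protocol can post this body to the service's base URL and read the merged file from the streamed response.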