Neural DSL v0.2.2 Release Notes

🚀 Major Changes

Fixed Parameter Parsing

Layer parameter handling has been significantly improved:

# Now correctly handles both styles:
Dense(64, "relu")     # Positional params
Dense(units=64, activation="relu") # Named params
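
For context, the same calls could sit inside a full network definition, something like the sketch below (the network/layers syntax follows the migration example later in this post; placing multiple layers on separate lines is an assumption here):

network ParamDemo {
    layers:
        Dense(64, "relu")                   # positional params
        Dense(units=64, activation="relu")  # named params
}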

Validation Enhancements

  • Strict positive-integer validation for critical parameters such as filters and units
# These will now raise clear validation errors:
Conv2D(filters=-32)  # ERROR: filters must be positive
Dense(units=0)       # ERROR: units must be positive
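
On the user side, the fix is simply to pass positive values:

Conv2D(filters=32)  # OK
Dense(units=64)     # OK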

Improved Error Messages

  • Added line/column information for better debugging
ERROR at line 4, column 32: Conv2D filters must be positive integer, got -32
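
For example, a negative filters value inside a network definition now points back to the offending call (a sketch; the exact line and column values reported depend on where the call sits in your file):

network BadNet {
    layers:
        Conv2D(filters=-32)  # validation error reported here with line/column info
}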

🛠️ Technical Improvements

Layer Parameter Processing

  • Unified parameter merging across layers (see the sketch after this list):
    • Dense
    • LSTM
    • GRUCell
    • GaussianNoise
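
Roughly, this means parameters are handled consistently across these layer types; a small sketch with one call per layer (the parameter names for LSTM, GRUCell, and GaussianNoise are assumptions here, following common Keras naming):

Dense(units=64, activation="relu")
LSTM(units=128)
GRUCell(units=64)
GaussianNoise(0.1)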

Grammar Refinements

  • Resolved token conflicts (example after this list) between:
    • NUMBER
    • FLOAT
    • INT
  • Simplified param_style1 rules
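
In practice this means numeric literals are tokenized unambiguously; a small example using layers already shown in this post (presumably 64 is lexed as an integer and 0.5 as a float):

Dense(64)     # integer literal
Dropout(0.5)  # float literal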

HPO Support Updates

# Now correctly supports:
HPO(choice(32, 64, 128))  # Units choice
HPO(choice("relu", "tanh"))  # Activation choice
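
Presumably these HPO expressions stand in for ordinary layer parameters; a sketch of that usage (whether HPO can be nested directly inside a layer call like this is an assumption):

Dense(HPO(choice(32, 64, 128)), activation=HPO(choice("relu", "tanh")))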

🐛 Bug Fixes

Layer-Specific Fixes

  • Fixed nested list flattening in GaussianNoise
  • Corrected STRING token regex for activation functions
  • Resolved VisitError wrapping issues

Macro System

  • Fixed parameter override logic during expansion
# Now correctly handles:
define MyBlock {
    Dense(64)
    Dropout(0.5)
}
MyBlock(units=128)  # Properly overrides Dense units
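
Conceptually, the override sketched above should expand to something like the following (an illustration, not literal compiler output):

# MyBlock(units=128) expands to:
Dense(128)    # units overridden from 64 to 128
Dropout(0.5)  # unchanged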

🚧 Known Issues

  1. PyTorch Support: Limited layer support (work in progress)
  2. Macro Stability: Potential parser issues with nested layer blocks
  3. HPO Limitations: log_range() requires explicit integer casting

📝 Migration Guide

Updating from v0.2.1

# Old style (might fail):
network MyNet {
    layers: Dense("64")  # String number
}

# New style (recommended):
network MyNet {
    layers: Dense(64)    # units passed as an integer
}

🔜 Next Steps

  1. Complete PyTorch layer support
  2. Stabilize macro system
  3. Enhance HPO functionality

For the full changelog, see CHANGELOG.md.
For the documentation, see docs/.
