Nushell support #296
Replies: 2 comments 1 reply
-
What would be needed to make it work? And out of curiosity, are you one of the maintainers or just an enthusiast?
-
Nushell's bracket-based syntax is significantly more token-efficient for LLM-assisted development than indentation-sensitive languages. Working with OpenAI Codex on Python means constant indentation edits: every nested change reformats multiple lines, wasting tokens on whitespace instead of logic.
Why Nushell works better: it is inherently LLM-friendly because single-word literals can be written without quotes (only literals containing spaces need them), and list elements are delimited by spaces alone, with no commas.
For shell scripting with LLM assistance, Nushell's syntax is fundamentally more compatible with how these models operate.
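To make the syntax points above concrete, here is a minimal Nushell sketch (illustrative only; the `fruits` variable is a made-up example):

```nu
# List elements are space-delimited; only literals containing spaces need quotes
let fruits = [apple banana "dragon fruit"]

# Blocks are delimited by braces rather than indentation,
# so editing a nested step never forces whitespace-only rewrites
$fruits | each { |f| $"fruit: ($f)" }
```

Because the structure is carried by `[]` and `{}` rather than leading whitespace, an LLM editing one line of a nested pipeline does not have to re-emit the surrounding lines just to keep the indentation consistent.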
-
Hey guys! It would be really cool if Codex worked with Nushell.
Nushell is gaining momentum now. Just check how steadily it is growing: https://trends.google.com/trends/explore?hl=en-GB&tz=180&date=today+5-y&hl=en-GB&q=%2Fg%2F11lkrdp6pf&sni=3
I believe Nushell, in general, is a very good candidate for your reinforcement learning exercises.