
Recently I've discovered that TCL is a very pleasant language for shell scripting. The command-based syntax and the "everything is a string" design are a perfect fit for that problem domain.

Next time that you have a shell script that is too complicated for bash but not complicated enough for Python, give TCL a try!



Any examples of this to share?

I think Tcl would suffer from the lack of syntax for pipelines, for example.


One of these days I'm going to write a blog post about it...

The built-in "exec" command supports pipelines and file redirections with the usual syntax, so you can do

    exec foo < bar | baz
and it works similarly to a bash command substitution

    $(foo < bar | baz)
The main catch is that if you have an argument that is "|" or "<" then TCL interprets it as part of a pipeline, even if it came from a variable. However, the flip side is that you don't need to worry about arguments with spaces or asterisks. There is no word splitting or glob expansion, so it is OK to leave everything unquoted.

    set r "<"
    set p "|"
    exec foo $r bar $p baz
There are also a couple of things that are slightly different from bash. One is that exec raises an exception if the command returns a nonzero exit code. If that is not what you want then you will need to use the TCL equivalent of try-catch. I wrote a small helper function for this common use case.
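A minimal sketch of such a helper, using Tcl 8.6's try/trap (the name "run" is my own, not the commenter's):

    # Hypothetical "run" helper: like exec, but a nonzero exit status
    # is not treated as an error -- the command's output is returned
    # either way.
    proc run {args} {
        try {
            exec {*}$args
        } trap CHILDSTATUS {output options} {
            # Note: on abnormal exit, exec appends a diagnostic line
            # such as "child process exited abnormally" to the output;
            # callers that care about the exit code can inspect $options.
            return $output
        }
    }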

Another thing is that exec returns a string with the output of the command, similarly to $() in bash. If you want to output to stdout then you need to work around that.
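One such workaround: exec's >@ and 2>@ redirections send the child's output to an open Tcl channel, e.g. the script's own stdout and stderr:

    # Stream the child's output straight to our stdout/stderr
    # instead of capturing it as exec's return value.
    exec ls -l >@stdout 2>@stderr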


Everything-is-a-string is a terrible design. Also I don't think there is much space in between Bash and Python. I generally recommend ditching Bash for Python as soon as you have any flow control.


Everything being a string is not that bad in a world of shell pipelines, where all the commands' arguments and results are strings anyway. And if you want, you can also use data structures like lists and dictionaries. They are a bit idiosyncratic, to support the everything-is-a-string metaphor, but under the hood they are still implemented as actual lists and dictionaries.

The main thing I dislike about migrating my bash scripts to Python is that the syntax is more verbose. subprocess.call can be a bit of a mouthful and you "need" "to" "quote" "everything", "like" "this". If the only fancy thing your script is doing is control flow then Tcl might be a good fit.
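For comparison, the list form in Python looks like this (a sketch; printf is just a stand-in command, and every word is indeed its own quoted string, though a path with spaces needs no extra escaping):

    import subprocess

    # Each list element is one argv entry: no shell, no word
    # splitting, so the space in `path` needs no quoting gymnastics.
    path = "my file.txt"
    result = subprocess.run(
        ["printf", "%s\n", path],
        capture_output=True,
        text=True,
    )
    print(result.stdout, end="")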


The quoting thing is definitely a benefit. Bash's lack of quoting usually leads to bugs (god help you if you have a space in your path).

I agree subprocess.call is quite verbose, but I find you have to do it less in Python scripts anyway because you can do things properly with libraries rather than hackily calling out to other programs (e.g. curl).


I work in a very polyglot environment, and doing "everything properly with libraries" would mean duplicating code from language 1 to language 2 and possibly languages 3 and 4 over and over again for zero benefit. Far from it being "hacky" to call out to other programs, let's write each program only once, and have easy ways to glue the results of different programs together regardless of what language they might have been written in.

I am just learning about tcl today and am seriously considering replacing bash with it. The way tcl avoids quoting looks very elegant, because of how it handles {}: it's not a scope, it's a single string argument. You can manipulate that argument as a string, and you always know exactly whether you have space-separated arguments or a single string, because the grouping is explicit.

Tcl does not do automatic expansion of glob patterns unless explicitly requested with `glob`. Where bash might end up dynamically splitting a space-containing argument into two arguments as it gets passed around, tcl will raise an error in the equivalent scenario because the arguments no longer fit the command given.
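A sketch of that explicit expansion: glob builds a list, and the {*} splat turns it into separate arguments only where you ask for it:

    # Expansion happens only on request: glob returns a list of
    # matching names, and {*} splats it into individual arguments.
    set files [glob -nocomplain *.txt]
    if {[llength $files] > 0} {
        puts [exec wc -l {*}$files]
    }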

The ease of metaprogramming feats is really feeling incredibly LISPy, but without all the verbosity that turns me off from LISP...and with easy access to any commands in the host shell. That means LISPy control over a polyglot language context.

Consider:

    puts "Hello world!"
    # Hello world!

    proc say {word msg} {puts "$word $msg!"}
    say Hello world
    # Hello world!

    set greeting "Hello"
    proc say$greeting {msg} {
        say [uplevel {set greeting}] $msg
    }
    sayHello world
    # Hello world!

    set pyScript {
        print("Python")
    }
    sayHello [exec python << $pyScript]
    # Hello Python!

    set nodeScript {
        console.log("Node")
    }
    sayHello [exec node << $nodeScript]
    # Hello Node!
I know this doesn't look that special, and basically just looks like doing normal bash stuff. But:

- there's no special handling you need to do with the scripts that need an outside interpreter,

- super easy to build these interactively within your tclsh, you can do macro-like actions on any of this very easily,

- you can send functions "from whatever language" around as arguments (and augment them) totally painlessly and not need to worry about syntax issues outside of the normal language context. You're either "inside" a UTF-8 encoded {} or you are not (in which case you are in the tcl context).


I agree. Python shines when you can use real libraries such as "requests" and when you can use proper string/regex manipulation instead of being forced to live with sed/cut/awk. However, sometimes I really just want to invoke a bunch of shell commands, including pipelines. In those cases I gravitate towards Tcl or Fish.



