Scripting Languages: Gluing Together Other Programs, ..
originally tools for quick hacks, rapid prototyping, gluing together other programs, ...
evolved into mainstream programming tools

characteristics:
   text strings as basic (or only) data type
   regular expressions (maybe) built in
   associative arrays as a basic aggregate type
   minimal use of types, declarations, etc.
   usually interpreted instead of compiled

examples:
   shell
   Awk
   Perl, PHP, Ruby, Python
   Tcl, Lua, ...
   Javascript, Actionscript
   Visual Basic, (VB|W|C)Script, PowerShell
the shell is just another program
   it can be replaced by one you like better
   therefore there are lots of shells, reflecting history and preferences
      sh    Bourne shell  (Steve Bourne, Bell Labs -> ... -> El Dorado Ventures)
      csh   C shell       (Bill Joy, UC Berkeley -> Sun -> Kleiner Perkins)
      ksh   Korn shell    (Dave Korn, Bell Labs -> AT&T Labs)
sh     emphasizes running programs and programmability
       syntax derived from Algol 68
csh    interaction: history, job control, command & filename completion, aliases
       more C-like syntax, but not as good for programming (at least historically)
ksh    combines programmability and interaction
       syntactically, a superset of Bourne sh
       provides all csh interactive features + lots more
bash   mostly ksh + much of csh
tcsh   evolution of csh
quoting:
   rm '*'                       careful !!!
   echo "It's now `date`"
variables, environment:
   PATH=/bin:/usr/bin           in ksh & bash
   setenv PATH /bin:/usr/bin    in (t)csh
I/O redirection:
   prog <in >out, prog >>out
pipes:
   who | wc
asynchronous operation:
   slow.1 | slow.2 &
arguments can be passed to a shell file ($0, $1, etc.)
if made executable, a shell file is indistinguishable from compiled programs
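for example, a sketch of a tiny shell file (the name wordfreq and the pipeline are invented for illustration):

   #!/bin/sh
   # wordfreq: print word-frequency counts for the file named as the first argument
   # $0 is the script's own name, $1 its first argument, $# the argument count
   case $# in
   0)  echo "usage: $0 file" 1>&2; exit 1 ;;
   esac
   tr -s ' ' '\n' <$1 | sort | uniq -c | sort -nr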
Shell programming
the shell is a programming language
the earliest scripting language
filename expansion (wildcards: *, ?, [...])
if-else:
   if cmd; then cmds; elif cmd; then cmds; else cmds; fi       (sh)
   if (expr) cmds; else if (expr) cmds; else cmds; endif       (csh)
loops: while, for
   for var in list; do commands; done       (sh, ksh, bash)
   foreach var (list) commands; end         (csh, tcsh)
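for example, a sketch (sh) combining wildcards, if, and for (the backup directory is made up):

   # copy each changed .c file in the current directory into backup/
   for f in *.c                        # *.c is expanded by the shell itself
   do
       if cmp -s "$f" "backup/$f"      # cmp -s: compare quietly, report via exit status
       then
           echo "$f unchanged"
       else
           cp "$f" "backup/$f"
       fi
   done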
Shell programming
shell programs are good for personal tools
tailoring environment abbreviating common operations
(aliases do the same)
gluing together existing programs into new ones
prototyping
sometimes for production use
e.g., configuration scripts
But:
shell is poor at arithmetic, editing
macro processing is a mess
quoting is a mess
sometimes too slow
can't get at some things that are really necessary
AWK
a language for pattern scanning and processing
Al Aho, Brian Kernighan, Peter Weinberger, at Bell Labs, ~1977
intended for simple data processing: selection, validation: "Print all lines longer than 80 characters"
length > 80
report generation: "Add up the numbers in the first field, then print the sum and average"
{ sum += $1 }
END { print sum, sum/NR }
"pattern" is a regular expression, numeric expression, string expression or combination of these "action" is executable code, similar to C usage: awk 'program' [ file1 file2 ... ] awk -f progfile [ file1 file2 ... ] operation: for each file for each input line for each pattern if pattern matches input line do the action
AWK features:
input is read automatically across multiple files
lines are split into fields ($1, ..., $NF; $0 for whole line)
associative arrays (arbitrary subscripts)
regular expressions (like egrep)
control flow statements similar to C: if-else, while, for, do
built-in and user-defined functions
   arithmetic, string, regular expression, text edit, ...
printf for formatted output
getline for input from files or processes
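a small sketch of printf and getline together (the choice of the date command is arbitrary):

   # read one line from a process with getline, then count input lines
   BEGIN { "date" | getline now
           printf("run started: %s\n", now) }
         { n++ }
   END   { printf("%d lines of input\n", n) }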
useful one-liners:
   precede each line by line number
   replace first field by line number
   print field 2, then field 1
   flip $1, $2
   zap field 2
   print last field
   print non-empty lines
   print if more than 4 fields
   print if last field greater than 4
   print matching lines (egrep)
   print lines where first field matches
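one plausible program for each (the standard one-liners, not necessarily the exact ones from the original handout):

   { print NR, $0 }                          # precede each line by line number
   { $1 = NR; print }                        # replace first field by line number
   { print $2, $1 }                          # print field 2, then field 1
   { tmp = $1; $1 = $2; $2 = tmp; print }    # flip $1, $2
   { $2 = ""; print }                        # zap field 2
   { print $NF }                             # print last field
   NF > 0                                    # print non-empty lines
   NF > 4                                    # print if more than 4 fields
   $NF > 4                                   # print if last field greater than 4
   /regexpr/                                 # print matching lines (egrep)
   $1 ~ /regexpr/                            # print lines where first field matches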
word and character count (like wc):
   { nc += length($0) + 1; nw += NF }
   END { print NR, "lines", nw, "words", nc, "characters" }

print longest line:
   length($0) > max { max = length($0); line = $0 }
   END { print max, line }
Control flow
if-else, while, for, do...while, break, continue
as in C, but no switch
for (i in array)   go through each subscript of an associative array
next               start next iteration of main input loop
exit               leave main input loop, go to END block
sum the fields on each line:
   { sum = 0
     for (i = 1; i <= NF; i++)
        sum += $i
     print sum
   }

sum all fields in all lines:
   { for (i = 1; i <= NF; i++) sum += $i }
   END { print sum }
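a sketch using next and exit together with an associative array (the skip and stop conditions are invented):

   /^#/       { next }                # skip comment lines; go straight to the next input line
              { count[$1]++; n++ }
   n >= 1000  { exit }                # stop reading input; control passes to the END block
   END        { for (w in count) print w, count[w] }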
Arrays
common case: array subscripts are integers
reverse a file:
   { x[NR] = $0 }       # put each line into array x
   END { for (i = NR; i > 0; i--) print x[i] }
Associative Arrays
array subscripts can have any value, not just integers
canonical example: adding up name-value pairs
input:
   pizza   200
   beer    100
   pizza   500
   beer     50
output:
   pizza   700
   beer    150
program:
   { amount[$1] += $2 }
   END { for (name in amount)
            print name, amount[name] | "sort +1 -nr" }
Anatomy of a compiler
input -> lexical analysis -> tokens -> syntax analysis -> intermediate form
      -> code generation -> object file -> linking -> a.out
a.out + input data -> output
(symbol table shared by all phases)
Anatomy of an interpreter
input -> lexical analysis -> tokens -> syntax analysis -> intermediate form
intermediate form + input data -> execution -> output
(symbol table shared by all phases)
YACC: converts a grammar and semantic actions into a parser for that grammar
Lex: converts regular expressions for tokens into a lexical analyzer that recognizes those tokens
how they fit together:
   the parser calls the lexer each time it needs another input token
   the lexer returns a token and its lexical type
when to think of using them:
   real grammatical structures (e.g., recursively defined)
   complicated lexical structures
   rapid development time is important
   language design might change
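a minimal sketch (not from the course materials) of how the pieces connect: a yacc grammar for sums of single-digit numbers, with a hand-written yylex() standing in for a Lex specification:

   %{
   #include <stdio.h>
   #include <ctype.h>
   int yylex(void);
   void yyerror(const char *s) { fprintf(stderr, "error: %s\n", s); }
   %}
   %token NUMBER
   %%
   line  : expr '\n'         { printf("%d\n", $1); }   /* semantic action runs on reduction */
         ;
   expr  : expr '+' NUMBER   { $$ = $1 + $3; }
         | NUMBER            { $$ = $1; }
         ;
   %%
   int yylex(void)           /* called by yyparse() whenever it needs the next token */
   {
       int c = getchar();
       while (c == ' ' || c == '\t')
           c = getchar();
       if (isdigit(c)) { yylval = c - '0'; return NUMBER; }   /* token value goes in yylval */
       return c == EOF ? 0 : c;                               /* 0 signals end of input */
   }
   int main(void) { return yyparse(); }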
YACC overview
YACC converts grammar rules & semantic actions into parsing fcn yyparse()
yyparse parses programs written in that grammar, performs semantic actions as grammatical constructs are recognized
an interpreter could
run directly from the program (TCL, shells)
interpret directly from the tree (AWK, Perl?):
   at each node, interpret children (recursion), do operation of node itself, return result
generate byte code output to run elsewhere (Java)
generate byte code (Python, ...)
generate C to be compiled later
YACC creates a parser from a grammar like the one below
when the parser runs, it creates a parse tree
a compiler walks the tree to generate code
an interpreter walks the tree to execute it
statement :
      IF ( expression ) statement                    create node(IF, expr, stmt, 0)
    | IF ( expression ) statement ELSE statement     create node(IF, expr, stmt1, stmt2)
    | WHILE ( expression ) statement                 create node(WHILE, expr, stmt)
    | variable = expression                          create node(ASSIGN, var, expr)
expression :
      expression + expression
    | expression - expression
    | ...
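to make "walk the tree" concrete, a sketch in C of one possible node layout and a tree-walking evaluator (the names and layout are invented, not awk's):

   #include <stdio.h>

   enum { NUM, ADD, IF, WHILE };        /* node operation codes */

   typedef struct Node {
       int op;                          /* which operation this node represents */
       int val;                         /* constant value, used by NUM leaves */
       struct Node *kid[3];             /* children: condition, body, else, ... */
   } Node;

   int eval(Node *p)                    /* interpret children recursively, then do own operation */
   {
       if (p == NULL)
           return 0;
       switch (p->op) {
       case NUM:   return p->val;
       case ADD:   return eval(p->kid[0]) + eval(p->kid[1]);
       case IF:    return eval(p->kid[0]) ? eval(p->kid[1]) : eval(p->kid[2]);
       case WHILE: while (eval(p->kid[0])) eval(p->kid[1]); return 0;
       }
       return 0;
   }

   int main(void)                       /* evaluate 1 + 2 by walking a two-level tree */
   {
       Node one = { NUM, 1, { NULL } }, two = { NUM, 2, { NULL } };
       Node sum = { ADD, 0, { &one, &two, NULL } };
       printf("%d\n", eval(&sum));
       return 0;
   }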
excerpt from the real awk grammar (awkgram.y):
   stmt:
       ...
     | while {inloop++;} stmt      { --inloop; $$ = stat2(WHILE, $1, $3); }
     | if stmt else stmt           { $$ = stat3(IF, $1, $2, $4); }
     | if stmt                     { $$ = stat3(IF, $1, $2, NIL); }
     | lbrace stmtlist rbrace      { $$ = $2; }

   while:
       WHILE '(' pattern ')'       { $$ = notnull($3); }
excerpt from the awk lexical analyzer:
   ([0-9]+(\.?)[0-9]*|\.[0-9]+)([eE](\+|-)?[0-9]+)? {
         yylval.cp = setsymtab(yytext, tostring(yytext), atof(yytext), CON|NUM, symtab);
         RET(NUMBER); }
   while    { RET(WHILE); }
   for      { RET(FOR); }
   do       { RET(DO); }
   if       { RET(IF); }
   else     { RET(ELSE); }
   return   { if (!infunc) ERROR "return not in function" SYNTAX; RET(RETURN); }
illustrates:
   little languages
   programs that write programs
   mechanization
Unit testing
code that exercises/tests a small area of functionality
   a single method, function, ...
helps make sure that code works and stays working
   make sure small local things work so you can build larger things on top
very often used in "the real world"
   e.g., JUnit
   e.g., can't check in code unless it has tests and passes them
#include <stdio.h>

int match(char *re, char *text);    /* the regular-expression matcher under test */

struct { int yesno; char *re; char *text; } tests[100] = {
    1, "x", "x",
    0, "x", "y",
    0, 0, 0,                        /* null entry marks the end of the table */
};

int main(void)
{
    for (int i = 0; tests[i].re != 0; i++) {
        if (match(tests[i].re, tests[i].text) != tests[i].yesno)
            printf("%d failed: %d [%s] [%s]\n", i,
                tests[i].yesno, tests[i].re, tests[i].text);
    }
    return 0;
}
Lessons
people use tools in unexpected, perverse ways
compiler writing: implementing languages and other tools
object language: programs that generate Awk
first programming language
bugs last forever
"One thing [the language designer] should not do is to include untried ideas of his own."
(C. A. R. Hoare, Hints on Programming Language Design, 1973)