Newsgroups: alt.os.multics,alt.sys.pdp10,alt.folklore.computers,comp.lang.lisp
Path: cantaloupe.srv.cs.cmu.edu!das-news2.harvard.edu!news2.near.net!howland.reston.ans.net!ix.netcom.com!netcom.com!vsocci
From: vsocci@netcom.com (Vance Socci)
Subject: Re: Retro-Computing!
Message-ID: <vsocciD6IuJ9.46J@netcom.com>
Sender: vsocci@netcom17.netcom.com
Organization: Xenos
X-Newsreader: News Xpress Version 1.0 Beta #2.1
References: <vsocciD6DErH.ADz@netcom.com> <3lmivt$ao@usenet.rpi.edu> <massagja-0304950924350001@sprawl.byu.edu> <D6HILL.GAJ@bonkers.taronga.com>
Date: Tue, 4 Apr 1995 14:46:08 GMT
Lines: 98

peter@bonkers.taronga.com (Peter da Silva) wrote:
>In article <massagja-0304950924350001@sprawl.byu.edu>,
>John Massaglia <massagja@pegate.byu.edu> wrote:
>>What makes you think that people write linearly because they think linearly?
>
>Does it matter? The dominant method for communicating complex concepts is
>now, and will for the forseeable future remain, writing. There's no benefit
>in attempting to design a programming methodology that isn't based in the
>largest part on linear text, except as an academic excersize.

You're too late - National Instruments and Hewlett Packard have both
shipped commercial-grade graphical languages.

I can only speak about the LabView language by National Instruments, since
I have no exposure to HPVL (HP Visual Language).

It is a real language: these days, it even compiles rather than interprets.

There is almost zero learning curve before you can program in it. All you have
to do is let go of the linear idea and understand the dataflow paradigm. There
is literally no syntax, except a tiny bit for the string parsing functions
(which look suspiciously like scanf format strings).

You can literally build user interfaces from a parts list of nifty controls, indicators,
text windows, etc.  Even graphs! You just wire them up and go.

The paradigm is that you're building an "instrument": a device with an
arbitrarily complex control panel. There are ways to have multiple panels
come up on top of each other, and lots of other great features. All without
having to study a book full of interface calls.

I was skeptical at first, being an old-time assembly language hacker. But after
I used it a few times for laboratory control applications, I was sold.

It's a natural way to express inherently multi-threaded system control and data
acquisition code. The language has its own embedded scheduler, and every function
is essentially a little task waiting for its inputs to be satisfied so that it
can emit an output. All functions are one-shot: if you need repetitive action
(which most programs do) you just enclose the whole thing in a loop structure,
which effectively samples the inputs to the loop and runs the data through the
network inside the loop. The outputs are successively emitted from the loop.
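To make the fire-when-inputs-arrive idea concrete, here's a toy sketch in Python. All the names here are hypothetical, invented purely for illustration; this has nothing to do with LabView's actual internals. Each node fires once every input wire has delivered a value, and enclosing the network in a loop re-samples the inputs each iteration, just as described above.

```python
class Node:
    """A one-shot function that waits for its inputs, then emits an output."""
    def __init__(self, name, func, n_inputs):
        self.name = name
        self.func = func
        self.n_inputs = n_inputs
        self.inputs = {}          # slot index -> value received so far
        self.listeners = []       # (node, slot) pairs wired to our output

    def wire(self, downstream, slot):
        """Connect this node's output to an input slot of a downstream node."""
        self.listeners.append((downstream, slot))

    def receive(self, slot, value):
        """Accept a value on one input; fire once every slot is satisfied."""
        self.inputs[slot] = value
        if len(self.inputs) == self.n_inputs:
            result = self.func(*(self.inputs[i] for i in range(self.n_inputs)))
            self.inputs = {}      # one-shot: reset for the next loop iteration
            for node, s in self.listeners:
                node.receive(s, result)

# Build a tiny network computing (a + b) * 2, with a sink collecting outputs.
results = []
add = Node("add", lambda x, y: x + y, 2)
dbl = Node("dbl", lambda x: x * 2, 1)
sink = Node("sink", lambda v: results.append(v), 1)
add.wire(dbl, 0)
dbl.wire(sink, 0)

# The enclosing loop samples fresh inputs each pass and runs the data
# through the network; the loop's outputs are emitted successively.
for a, b in [(1, 2), (3, 4)]:
    add.receive(0, a)
    add.receive(1, b)

print(results)  # [6, 14]
```

In a few dozen lines this captures the scheduling rule the post describes: no statement ordering, only data dependencies.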

You can of course write to disk files, printers, whatever you like. It's a
functionally complete programming language. And it's easy to write in: for the
most part there's no burden of remembering what name a certain function has in
a particular language. To find a function, you bring up a pop-up of the families
of functions, select a family, and get your function. Very nice.

I'm afraid no description can do the experience justice; I've made real money
writing real applications in this language. National Instruments has poured a
lot of money and commitment into it. A language is a *culture*; it's not easy
to get one started, and it's not easy to get rid of a popular one. National
seems to realize this, and has targeted this language at the people who do
modelling and real-time data acquisition.

The performance is not as good as a hand-coded application's; at least it wasn't
a couple of years ago, which was the last time I used LabView. This is probably
because of the internal task-scheduling overhead. It is also missing a few
language features, but nothing that couldn't easily be added.

So I'm afraid it's down to the blind man and the sunset: it's real, it works,
and I think it's the future. To say there's no use in going beyond typewriter
languages is at best a self-fulfilling limitation.

Again: if typewriter languages were so great, why don't people design
hardware with them instead of using diagrams? Why not just try to come
up with text-formulas to describe the shape of a design instead of using
a CAD package?

Code in any language is just that: an encoding of the solution to a problem,
heavily biased towards the engine that will execute the code. Our languages
look like they do because our machines are linear, and no one seems to have
the imagination and courage (I guess I have to include myself in this
accusation) to design new architectures and get them to be popular. I'd rather
see some languages that are closer to describing the *problem*; then we can
generate solutions to the problem based on the available "realware"/hardware.

LabView is like that; it frees the programmer from having to express themselves
in terms of a linear procedure. This results in a better match to actual
real-time problems. Sure, you can find an isomorphic solution in a linear
language; the proof of that is that LabView generates code for a linear machine.
But if we gave LabView an arbitrary number of processors with programmable
interconnection and a code-generator/scheduler to match, it would *instantly*
be able to take advantage of the added parallelism. No linear language has that
ability without forcing the programmer to adhere to artificial restrictions on
coding.
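The claim that a dataflow graph exposes parallelism "for free" can be sketched in a few lines of Python using a thread pool (again, purely illustrative, not LabView internals): any two nodes whose inputs are ready may run at the same time, so the same graph runs unchanged on one worker or many.

```python
from concurrent.futures import ThreadPoolExecutor
import time

def slow_square(x):
    """Stand-in for a node doing real work."""
    time.sleep(0.1)
    return x * x

def run_graph(workers):
    # Two independent branches (squares of 3 and 4) have no data
    # dependency on each other, so the scheduler may fire them in
    # parallel; the final add must wait for both, exactly as a wired
    # diagram would dictate.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        a = pool.submit(slow_square, 3)
        b = pool.submit(slow_square, 4)
        return a.result() + b.result()

start = time.perf_counter()
total = run_graph(workers=2)
elapsed = time.perf_counter() - start
print(total)                      # 25
print(f"took {elapsed:.2f}s")     # roughly 0.1s when the branches overlap
```

Nothing in the graph itself mentions threads: adding workers changes only the scheduler, which is the point of the paragraph above.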

[end of rant]



- Vance

/=======================================\
|    Vance Socci   vsocci@netcom.com	|
| "The worst secrets are those we keep	|
|   from ourselves . . ."		|
| "I am not a number; I am a free man!"	|
\=======================================/
