24 August 2015, 12:36   #302
Mrs Beanbag
Glastonbridge Software
Join Date: Jan 2012
Location: Edinburgh/Scotland
Posts: 2,202
Originally Posted by idrougge
Once you get used to curly braces (and learn to type them if you have a non-English keyboard), they may seem simple enough, but they also take some effort for the brain to decode.
Not a great deal, unless there is a lot of nesting, which is bad style anyway. Text editors can highlight the matching pairs, as well, which is very useful. There are benefits and drawbacks of every style, of course.

Originally Posted by idrougge
I had my doubts about trying to teach a beginner programmer the art of Python indentation, but I think I've come to the conclusion that it makes sense. C programmers tend to indent their blocks as well, even though they have their braces, so why do both when one will do?
It's great until you accidentally get some tabs mixed up in there! It's also pretty easy to screw up the indentation by accident, and then good luck finding that bug.
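To make that concrete, here's a made-up snippet (nothing from the thread): every line looks consistently indented in an editor that displays tabs as 4 columns, but the marked line uses a tab while the rest use spaces.

Code:
# Hypothetical example: all of this *looks* aligned with 4-column tabs.
def total(prices):
    subtotal = 0
    for p in prices:
        subtotal = subtotal + p
	print(subtotal)   # <-- indented with one TAB, not spaces
    return subtotal

Python 3 at least refuses to guess and stops with a TabError; Python 2 silently expanded the tab to 8 columns, so that print ran inside the loop even though it appeared to come after it.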

Originally Posted by idrougge
Of course, the exact names of these statements may be haphazard, and I think it took me fifteen years to understand what WEND meant. COMAL tries to sort this out by unifying the format of the block terminators – e.g. FOR…ENDFOR, IF…ENDIF, WHILE…ENDWHILE and so forth.
COMAL is certainly on the right track there, but I think my main problem is that you can't immediately see the structure: the symbols that delineate the blocks are just words, indistinguishable from other kinds of statements without actually reading them. How do you like XML, by the way?

Originally Posted by idrougge
Lisp may not be so bad, though. When I took a programming course in school at 17, the first teaching language was Scheme.
If computers had been invented by the Arabs, we'd all be using something like Lisp.

Originally Posted by idrougge
Let's face it, PRINT statements will always be a bit special, because they are a bit special. Everyone, from "hello world" to a senior programmer's debug output, uses it. It can afford to be a bit special. Even "printf" is a special case, because usually functions don't take varying numbers of arguments.
Not so... anyone can write a "variadic function" in C, although it is a pain in the ass, so generally nobody does.
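For contrast, and since Python keeps coming up in this thread, the same thing there is trivial. A hypothetical sketch of my own (and note that Python 3's own print is just such a function rather than special syntax):

Code:
# A variadic function in Python: one star, no ceremony.
def shout(*args, sep=" "):
    """Accept any number of arguments; join and uppercase them."""
    print(sep.join(str(a) for a in args).upper())

shout("hello", "world")    # prints: HELLO WORLD
shout(1, 2, 3, sep=", ")   # prints: 1, 2, 3

The C equivalent needs the va_list / va_start / va_arg machinery from stdarg.h, which is exactly the pain in the ass I mean.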

Originally Posted by Mrs Beanbag
I don't see what's so sensible about using "=" as an assignment operator in the first place, or what's so sensible about using infix notation when every other part of the language puts arguments at the end.
Originally Posted by idrougge
That's nice, but what you're suggesting is one of Meagol's beloved BASIC offspring. It's still got the same basic accessibility as BASIC, or Python if you wish.
It really isn't... my gripe with BASIC is not its absence of features, but the lack of generality and structure in the syntax. The "new features" I've suggested are only the logical extension of existing features that are currently only allowed in a single context. "0 to 10 step 2" is a phrase that already appears in BASIC programs. What I'm suggesting is to interpret it as an object in its own right that can be used generically.
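For what it's worth, that is more or less what Python's built-in range already is. A quick illustration (my example, not anything proposed in the thread):

Code:
# "0 to 10 step 2" as a value in its own right.
# (range's stop is exclusive, hence 11 to cover 0..10)
evens = range(0, 11, 2)

for n in evens:                  # usable as a loop header, as in BASIC...
    print(n)                     # 0 2 4 6 8 10

print(8 in evens)                # ...but also a generic object: True
print(evens[2])                  # indexable: 4
print(list(reversed(evens)))     # [10, 8, 6, 4, 2, 0]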

Anyway, I'm still puzzled by this claim of BASIC's "accessibility". You still have to learn it, just like anything else. It might "look a bit like English," but it isn't English. You can't just type "hack into CIA mainframe". And anyway that doesn't help anyone who doesn't speak English, which is most people in the world. The most you can say about it is that it doesn't look too scary, and maybe a lack of unfamiliar characters helps with that. Maybe it's time for a language that uses emoji.

Originally Posted by idrougge
Depends on what you mean by modern. Structured programming was invented around 1959-60 with ALGOL, and there was at least some kind of Lisp.
I mean object-oriented programming, functional programming, generic programming... but all you're telling me now is that BASIC was already out of date when it was new.