At the request of my grandchildren, who are beta testers for this content, I am going to video an introduction to programming using Python for people who have no intention of becoming programmers. This multiple video course is designed to provide my viewers with a basic understanding of computer programming so they are better able to engage with computer programmers in their business, social life, or families.
In on-the-job teaching of programming, and tech in general, I’ve found that relating tech concepts to real-world analogies is very helpful: e.g., explaining how a hard drive works by comparing it to an office filing system. The drive is a file room, the partitions are file cabinets, the directories are file folders, and of course there are the actual files. One concept people seem to have trouble with is the idea that deleting files doesn’t actually remove them from the drive. So, I started equating it to putting a red “sealed” flag on a file. The file is still there, but you’re not allowed to look at it.
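Since the course is in Python, here’s a toy sketch of that “sealed flag” idea in Python. The file name and contents are made up for illustration; real filesystems do this at a much lower level, but the principle is the same: “deleting” just flips a flag, and the data stays put until something overwrites it.

```python
# Toy illustration of the "sealed flag" analogy: deleting a file
# only marks it deleted; the bytes are still there.

files = {
    "report.txt": {"data": b"quarterly numbers", "deleted": False},
}

def delete(name):
    """'Delete' a file by flagging it, not erasing it."""
    files[name]["deleted"] = True

delete("report.txt")

# Normal listings hide it, but the data survives:
visible = [name for name, f in files.items() if not f["deleted"]]
```

After the delete, `visible` is empty, yet `files["report.txt"]["data"]` still holds the original bytes, which is exactly why undelete tools work.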
Well, I’m looking forward to watching your video, Mr. Shapiro!!!
I know nothing about programming. Sounds like this video is aimed at me!
[QUOTE=irvshapiro;n28]I am going to video an introduction to programming using Python for people who have no intention of becoming programmers.[/QUOTE]
C’mon Irv, let’s get them started right: assembler first, then maybe Pascal… ?
Ah, them were the days.
Why not Fortran or Cobol LOL!!! Just kidding
Assembly is a good start for chip programming, as far as I know!
LOL. Well, the point of learning assembler is that it teaches how computers actually work, what a CPU does, and how it does it. This provides a better foundation for learning other languages.
The idea of learning Fortran or Cobol isn’t as crazy as it sounds. They are both in use every day. In fact, Fortran & Cobol programmers are in demand. Cobol has evolved from a mainstream application language to one that’s used to tie front and back ends of databases together: a meta language if you will. Its durability is really pretty amazing, given that Cobol dates back to 1959. I believe it’s because it was so well designed. I give Grace Hopper a lot of credit for that. Admittedly, it is declining in popularity these days, but I think that’s due in large part to a lack of qualified programmers.
Fortran, OTOH, is still viable because formula translation is still needed in STEM fields. Of course it has evolved to keep up with modern needs. As Wikipedia puts it, it’s used for “numerical weather prediction, finite element analysis, computational fluid dynamics, geophysics, computational physics, crystallography and computational chemistry”.
BTW, I wrote my first for-profit program using Cobol. It was an add/change/delete payroll program for a gov’t agency. ?
I remember a million years ago when I bought a Commodore Vic 20 from Toys R Us for my son. These days you can have a computerized ring that does a thousand times more than this machine. But it entrapped me. I used to go to the library in Oxnard, Cal. to find programs in Basic to type into this brain-dead beast. The sense of power it gave me was quite incredible, considering what I see now. I would guess that was circa 1982 or ’83. Oh how times change!
I still have, in my basement storage room, a fully functioning Kaypro II. I originally upgraded it from 2.5 to 5 MHz (Kaypro had made a wiring mistake in a divide-by chip which made it so they couldn’t address RAM correctly if they clocked it above 2.5 MHz; someone discovered the error and put out a fix on Cornucopia). I also rigged it so it could use a hard drive. And I added a black mesh screen over the display, to increase contrast and reduce blooming.
My first PC was an Amstrad 6128 with the tape cassette recorder. On this PC I learned Basic.
I think Python is a great place to start. It offers a wide range of uses with both hardware and software integration. I can’t wait to learn something new, because no matter how much one might think they know about something, there is always more to learn.
I’m kind of leery of snakes but I’ll take a look at it.
Don’t let it rattle you! ?
Python is great, but it is an interpreted language, like so many of the languages used today. They are versatile, but there is a price to pay: performance. Interpreted languages rarely come close to the performance of compiled languages. With today’s computers that certainly is not the concern it once was, but it can still be an issue in some situations.
I would ask you what the difference was but I probably wouldn’t understand the answer.
Actually, the difference is not really hard to understand. It all starts with how computer processors work. Virtually everyone knows that CPUs work with 1s and 0s, not language as humans use it. That’s the crux of the issue. In order for a CPU to run programs written by humans, the program must be translated into the CPU language used by computers. The difference between compiled and interpreted programming languages is when that translation takes place.
With a compiled language, the translation takes place once. Someone writes a program and then has another program called a compiler read their new program and translate it for the CPU. After that, every time the program is run, it’s already in the language of the CPU, so no further translation is necessary.
With an interpreted language, there is no up-front translation into CPU language. Every time a program written in an interpreted language is run, a program called an interpreter must translate the instructions into CPU language, in real time… every single time. In computer processing terms, interpretation is an expensive process: i.e. it takes a lot of time. That translates into slower performance. For programs that get executed a lot (perhaps millions of times a day), that time adds up significantly.
The obvious question is why are interpreted languages used at all then? Well, they’re convenient. Non-programmers can learn them without having to understand all that compiler ‘nonsense’. They can be debugged, edited, and modified more easily and quickly than compiled languages. It’s like everything else in life: there is always a price to pay. In this case the trade-off is performance vs convenience.
Obviously, I’m brushing over quite a few subtleties, but the basic thrust is true.
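Since this thread is about Python, here’s a toy sketch of the interpreter idea itself, written in Python. The little “language” (ADD, MUL) is invented for illustration, but it shows the key point above: every call to run() re-parses and dispatches the program text, so the translation cost is paid on every run, not once up front.

```python
# A toy interpreter for a made-up two-instruction language.
# Each call to run() parses and dispatches the source text again,
# which is the per-execution translation cost described above.

def run(program, x=0):
    """Interpret a tiny language: one instruction per line."""
    for line in program.strip().splitlines():
        op, *args = line.split()
        if op == "ADD":
            x += int(args[0])
        elif op == "MUL":
            x *= int(args[0])
        else:
            raise ValueError(f"unknown instruction: {op}")
    return x

source = """
ADD 5
MUL 3
"""

result = run(source)   # parsing + dispatch happen here, at run time
```

A compiler, by contrast, would do that parsing once and save machine code, so later runs skip the translation entirely.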
BTW, g-code is an interpreted language. The motherboard in the printer doesn’t actually understand the command G28. That’s human notation. The firmware must translate G28 into 1s and 0s so the processor can send out electrical signals that cause the printer’s X, Y, and Z axes to return to their home position. And no compiling is needed. You can use something like Pronterface to send a G28 command to the printer over a USB cable, in real time. That’s an advantage.
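To make that concrete, here’s a very simplified Python sketch of what printer firmware does with a g-code line. Real firmware is far more involved (feed rates, acceleration, endstops), and the position dictionary here is just an illustration, but the shape is the same: the text “G28” means nothing to the CPU until the firmware parses it and dispatches to the matching routine.

```python
# Hypothetical sketch of firmware-style g-code interpretation.
# The position is modeled as a dict of axis letters to coordinates.

def handle_gcode(line, position):
    """Interpret one g-code command against the current position."""
    parts = line.split()
    code = parts[0]
    if code == "G28":                 # home: all axes return to 0
        return {axis: 0.0 for axis in position}
    if code in ("G0", "G1"):          # linear move, e.g. "G1 X10 Y5"
        new = dict(position)
        for word in parts[1:]:
            new[word[0]] = float(word[1:])
        return new
    raise ValueError(f"unsupported command: {code}")

pos = {"X": 10.0, "Y": 5.0, "Z": 2.0}
pos = handle_gcode("G28", pos)        # homing zeroes every axis
```

No compile step anywhere: the command text is translated at the moment it arrives, which is exactly why you can type G28 into Pronterface and watch the printer react immediately.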
Actually I think I understand. But even compiled languages like C++ are translated to machine language if I understand correctly.
You understand correctly. Sorry, I must not have been clear. All compiled language programs are translated into machine language. As I said, it’s about when that translation takes place, and how often it takes place. Compiled programs are translated once, interpreted programs are translated every time they are executed.
Thank you for sharing your expertise.
After 40 years of experience, my conclusion is that extreme problems require specific programming languages due to performance or memory requirements. However, for many problems the best programming language is the one your team knows, is comfortable with, and speeds your time to market.
I am using Python to teach non-programmers about the art of programming since the limited size of the language (limited verbs and statements, reserved words, syntax), forced code formatting requirement and the dynamic variable typing make it easier to concentrate on teaching programming concepts with less time spent on syntax. Besides, the development of CircuitPython and other Python implementations runnable on very inexpensive single board computers and microcontrollers will allow me to begin to teach folks about designing custom devices.
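To show a couple of those features in a few lines (the variable names are just examples): indentation is the block structure, so there are no braces to explain, and a variable’s type simply follows its current value.

```python
# Indentation IS the block structure: the nested lines below belong
# to the loop and the if purely because of how they are indented.

answer = 42            # answer holds an int here...
answer = "forty-two"   # ...and a str here; the name just rebinds

found = False
for ch in answer:
    if ch == "-":
        found = True   # dynamic typing: no declarations needed
```

Both features remove boilerplate that beginners would otherwise have to learn before writing their first useful line of code.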
My full plan is to help thousands of people become “makers”: designing in TinkerCAD and FreeCAD, printing components on 3D printers, and integrating these solutions with controllers programmed in Python. Think of the fun we can all have learning together.
On a more serious note, the availability of tools, components, and tutorials is democratizing the world of product design. Today, it is possible to design a product, for fun or profit, that just 10-15 years ago would have required a team of designers, expensive prototyping services, and specialized expertise.
I am very interested in everyone’s thoughts.
P.S. I am not a fan of Java or Ruby for very different reasons, and while I have not used C# as a professional, I find the syntax quite lovely.