Programming Languages: Why not human language?

Programming Languages

A programming language is a set of instructions given to the computer to perform various operations on the data supplied, in a planned and orderly manner. Instructions and data are placed together in the computer's memory in the form of magnetised bits, or binary digits. The stored-program concept makes the computer capable of handling any job for which a program is fed into it along with the data to process.

The purpose of a set of instructions is to make the computer perform a specific activity, which could be anything from producing a payroll to updating a sales ledger or carrying out complex mathematical computations.

Because computers are not human beings, the set of instructions must specify in minute detail the logical steps to be followed in order to solve the problem. Early programmers had to write their programs as strings of 0s (zeros) and 1s (ones), the same form as the computer's internal storage, i.e. binary.

Why is machine language different from human language?

Human-language programming has been a niche pursuit for programmers since programming became a possibility. COBOL, a compiled English-like programming language, was one of the first attempts at this idea. Though it was not a human natural language, compared with the other machine languages that appeared around the same time (1959), it was in many respects almost like English.

COBOL was originally envisioned as a language that would allow business managers to create their own programs rather than depend entirely on those early, nerdy programmers, but in the end it spawned multiple generations of professional COBOL programmers instead.

Four Divisions of COBOL

Identification Division
  • This is where the program is named and details such as the author are specified.
Environment Division
  • The equipment to be used by the source and object programs is described.
Data Division
  • A clear description of the files and data items to be used is given.
Procedure Division
  • The manipulative actions performed on the data are defined.

Note that even if the program is small, it must include all four divisions, hence the long, wordy COBOL programs. A minimal sketch follows.
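
To see what this means in practice, here is a minimal COBOL program (a sketch in GnuCOBOL's free format; the program name HELLO is invented for illustration). Even though all it does is print one line, all four divisions appear:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. HELLO.
    *> Empty in this sketch, but in classic COBOL practice
    *> these divisions were written out even when nearly bare.
    ENVIRONMENT DIVISION.
    DATA DIVISION.
    PROCEDURE DIVISION.
        DISPLAY "HELLO, WORLD".
        STOP RUN.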

To understand the import of this perspective, imagine a teacher who tries to work a maths question without mathematical notation and, because of that impossibility, is forced to write the whole problem and its answer out in words rather than symbols.

And therein lies the problem with human language as a programming language. We use programming languages for the same reason we use mathematical notation rather than writing algebra out in English or any other language.

The difference from human language

A programming language is different from human language because it is specific about what is being solved, when a cycle ends, and other information that is merely inferred in our everyday human language.

Example 1. The COBOL statement DIVIDE A BY B GIVING C can be much shorter in, for example, BASIC: C=A/B. Note that the COBOL statement is more self-describing, and that COBOL programs are usually semi-compiled, with the intermediate code interpreted.
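
As a hedged sketch (the data names and PICTURE clauses here are assumptions for illustration, not from the original), the same computation embedded in a complete COBOL program might look like this:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. DIVDEMO.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    *> Sample values chosen arbitrarily for the sketch.
    01 A PIC 9(4) VALUE 100.
    01 B PIC 9(4) VALUE 4.
    01 C PIC 9(4).
    PROCEDURE DIVISION.
        *> The verbose, self-describing COBOL form of C=A/B.
        DIVIDE A BY B GIVING C.
        DISPLAY "C = " C.
        STOP RUN.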

Example 2. If I were to write “Write” “text”, it is very confusing because it involves four quotation marks; and if I decide to write “Write (text)”, it is not clear that “(text)” refers to any text rather than to the literal characters “(text)”. It makes more sense to resolve this by writing “Write: text”.

Use of the colon

The colon serves the same function as the quotation marks, or as the change of intonation when speaking. It would be ambiguous to write to someone, for example, “write cancel”, compared with the more understandable “Write: cancel”.
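
COBOL resolves the same ambiguity with quotation marks: a quoted operand is a literal, an unquoted one is a data name. A small sketch (the data name CANCEL-MSG and its value are invented for illustration):

    IDENTIFICATION DIVISION.
    PROGRAM-ID. QUOTES.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01 CANCEL-MSG PIC X(20) VALUE "Operation cancelled.".
    PROCEDURE DIVISION.
        *> Quoted: prints the literal word CANCEL.
        DISPLAY "CANCEL".
        *> Unquoted: prints the contents of the data item CANCEL-MSG.
        DISPLAY CANCEL-MSG.
        STOP RUN.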

The loop construct is just the most superficial example of how natural language does not work for programming. Inference is the key part: how does the computer “infer” the intent of the programmer?

What makes programming languages work is syntax and semantics expressed in a fixed, standardized manner, not subject to change with human language or culture, or with a new and cleverer compiler that makes better, but different, inferences. We do not have to make sure that a new inference engine is bug-for-bug compatible with the previous one.
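
A COBOL loop, for instance, states its termination condition explicitly; nothing is left to inference. A minimal sketch (the counter name and the limit are assumptions for illustration):

    IDENTIFICATION DIVISION.
    PROGRAM-ID. LOOPDEMO.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01 I PIC 9(2) VALUE 1.
    PROCEDURE DIVISION.
        *> The loop says exactly when it ends: when I exceeds 5.
        PERFORM UNTIL I > 5
            DISPLAY "STEP " I
            ADD 1 TO I
        END-PERFORM.
        STOP RUN.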

A novice experiment

Write down an algorithm in a natural language of your choice. Now give it to a six-year-old to perform.

    • Open the refrigerator.
    • Take out a bit of cheese and a milk carton.
    • Roll the cheese into a ball.
    • Take a sheet of paper.
    • Draw the things that you took out of the fridge on the paper.
    • Draw something more if you want.
    • Else draw a goat and take some more out of the fridge.
    • If you consider that the goat is a pretty drawing, save in the fridge what you took out of it.
    • Else if, do all the steps again. If there isn't a milk carton, take something more.
    • Stick it on the fridge with adhesive tape.
    • Put the milk back in the fridge and eat the cheese.

Try to enter the above instructions into a computer. Note how much cultural inference the six-year-old applies. How, then, can we get a computer to do what a six-year-old knows how to do?
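
A computer accepts none of that ambiguity. As a hedged sketch of what even a few of the steps would demand (all names and messages are invented for illustration), every ELSE must belong to an explicit IF and every condition must be spelled out:

    IDENTIFICATION DIVISION.
    PROGRAM-ID. FRIDGE.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01 GOAT-IS-PRETTY PIC X VALUE "Y".
    PROCEDURE DIVISION.
        DISPLAY "OPEN THE REFRIGERATOR".
        DISPLAY "TAKE OUT THE CHEESE AND THE MILK".
        *> The dangling "Else" in the list above cannot be written;
        *> an ELSE is only legal inside a matching IF.
        IF GOAT-IS-PRETTY = "Y"
            DISPLAY "PUT BACK WHAT YOU TOOK OUT"
        ELSE
            DISPLAY "DRAW A GOAT"
        END-IF.
        STOP RUN.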

Programming languages are different from human language because they send information to computers, which have no cultural inference. They translate that information so it can be read by the microprocessor; they therefore have to be simplified and use unambiguous expressions.

What is your own opinion on the subject? Keep the discussion going in the comment area and subscribe to our weekly newsletter.
