Today, almost every aspect of life is run by computers. Thousands of devices and networks communicate through shared, logic-based languages to work in sync, keeping airplanes in the air and YouTube on smartphones around the world.
As Tech rides the technological wave of the future, learning a basic computer language is now a requirement in every degree program. In some schools, students are required to learn Spanish; at Tech, it is Python or Java.
As programming languages become more prominent, it is worth asking how accessible they are to the general public. What makes them different from natural languages like English? How are they similar? How can computer languages be made universal enough for everyone, not just computer scientists, to master?
In the most basic terms, a computer language is derived from logic. Every command relates back to some mathematical or mechanical operation within the computer.
“A computer science language is generally a command-based set of instructions […] it does not have the full range of expressions that a formal language does. The thing about programming languages is that they don’t have cultural baggage or cultural nuances you need to communicate in a non-CS language,” said Michael Shin, a fourth-year CS major.
A natural language contains a wealth of words related to emotion, semantics, and abstract ideas that are not immediately (or ever) tangible. In addition, there are meanings and intentions associated with groupings of words, such as idioms, that are specific not only to individual situations but to entire cultures.
“In difficulty, it’s really up to the person, because there are people that can immerse themselves in the form of a cultural language, where there are those who can’t immerse themselves in the culture,” said Katie Flint, a fourth-year STAC major.
It can be more difficult to learn a natural language when so much context surrounds the use of its words.
Another difference between computer and natural languages is how they represent meaning. In Java or C, an operation or variable is represented by a single, unambiguous symbol.
In English, homonyms are common: the word “bat” alone can mean an animal, a piece of baseball equipment, or an action. In a computer language, the value a variable holds can change, but the symbol itself never gains a second meaning. Before an operation, x may equal 5; afterwards, x equals 10, yet x still refers to the same single variable.
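As a rough illustration (a hypothetical snippet written for this article, not taken from any course), a few lines of Python show the idea: the symbol stays the same even as the value it names changes.

x = 5        # the symbol x names the value 5
x = x + 5    # after this operation, the same symbol x now names 10
print(x)     # prints 10; x never picks up a second meaning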
Generally speaking, natural languages rely heavily on meaning to derive function, whereas computer languages concern themselves with producing an outcome.
“In natural languages, certain words can have multiple meanings, and one meaning can be expressed by multiple words,” said Anes Fific, a third-year BA major.
Over time, however, more and more high-level, conceptual computer languages have been developed to translate English-like phrasing into logical commands.
The biggest example now is Python, which uses full, common words such as “and,” “for” and “try” to construct operations that are easier for the everyday user to understand. In this way, the language is engineered to reach a wider audience and make programming visually simpler.
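To see why, consider a short, hypothetical Python snippet (a sketch for illustration, not from any actual assignment): even a reader who has never programmed can roughly guess what it does, because “for,” “and” and “try” are ordinary English words.

# loop over a few words and pick out the short ones that start with "b"
for word in ["bat", "ball", "glove"]:
    try:
        if word.startswith("b") and len(word) == 3:
            print(word, "is a short b-word")
    except AttributeError:
        print("skipping an item that is not text")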
There is a downside, however: with computer languages, a fine line separates user-friendly code that the average non-CS major finds straightforward from the more complex code that is written for optimum efficiency.
“As you go up the ladder from low-level to high-level people-friendly languages, your operability and power with what you do within programs steadily decline,” Shin said.
Despite their differences, computer languages have been built around natural languages in order to allow a degree of human interaction with processors. Through the creation of new conceptual languages, the two are growing more and more alike, and one day programming may be as second-nature as speaking for a majority of students.