COS 333: Chapter 1, Part 2

Willem van Heerden
11 Aug 2020 · 45:47

Summary

TL;DR: This lecture delves into the foundational concepts of programming languages, exploring the influences shaping their design, such as computer architecture and methodologies. It outlines various paradigms including imperative, functional, and logic programming, each with unique approaches to computation. The discussion also covers implementation methods like compilation, interpretation, and hybrid systems, highlighting their trade-offs in efficiency and portability. Finally, it touches on programming environments and tools that aid developers in coding, debugging, and profiling software.

Takeaways

  • 🌟 The lecture discusses preliminary concepts in programming languages, including influences on language design, language categories (paradigms), implementation methods, and programming environments.
  • 🔍 Influences on programming language design are primarily computer architecture and programming methodologies, with von Neumann architecture significantly impacting the nature of programming languages.
  • 🔧 Imperative programming languages, such as C, C++, and Java, are dominant due to their close modeling of the von Neumann computer architecture, focusing on variables, assignments, and efficient iteration.
  • 📚 The evolution of programming languages has been driven by the development of new software methodologies, leading to the introduction of features like functions, modules, data abstraction, and object-oriented programming.
  • 🛠️ Programming language implementation methods include compilation, interpretation, and hybrid systems, each with their advantages and disadvantages in terms of execution speed, error reporting, and portability.
  • 🔄 The compilation process involves several phases: lexical analysis, syntax analysis, semantic analysis, and code generation, resulting in efficient machine code execution.
  • 🔍 Pure interpretation offers advantages like better error reporting and portability but suffers from slower execution speeds due to the need to decode high-level statements at runtime.
  • 🤝 Hybrid implementation systems, like the Java Virtual Machine, balance compilation and interpretation, offering improved error reporting and execution speeds while maintaining portability.
  • 🚀 Just-In-Time (JIT) systems are a type of hybrid implementation that compiles sub-programs into machine code on-the-fly, improving execution speed for subsequent calls.
  • 🛠️ Programming environments encompass a collection of tools that support software development, ranging from simple command-line tools in Unix-like systems to integrated development environments (IDEs) like Visual Studio and NetBeans.
  • 📈 The course will continue to explore the history and evolution of major programming languages in subsequent chapters, providing a deeper understanding of language development over time.

Q & A

  • What are the main factors that influence the design of a programming language?

    - The main factors that influence the design of a programming language are computer architecture and programming methodologies. Computer architecture refers to the prevalent computing hardware at the time of language development, while programming methodologies involve the software development processes and paradigms that shape the features and capabilities of the language.

  • Why is the von Neumann architecture significant to the development of programming languages?

    - The von Neumann architecture is significant because it is the most prevalent computer architecture, and it influences the nature of programming languages by dictating how data and instructions are stored and processed. Most modern computers follow this architecture, which is why many programming languages are designed to work closely with it.

  • What is imperative programming and how does it relate to the von Neumann architecture?

    - Imperative programming is a paradigm where programs are written as a sequence of commands for the computer to perform. It closely relates to the von Neumann architecture by modeling the use of variables as memory cells and assignments as the data transfer between memory and CPU, reflecting the architecture's design.

  • Why are iteration structures efficient in imperative programming languages?

    - Iteration structures are efficient in imperative programming languages because they can leverage the sequential storage of values in memory. By using a counter and incrementing it, programs can easily move through memory locations, making iteration a fast and straightforward process.
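
    As a rough illustration (a Python sketch, not from the lecture), the two functions below compute the same sum. The loop simply advances a counter through sequentially stored values, while the recursive version pays for a new stack frame on every call, which is why the lecture describes recursion as discouraged in imperative languages:

        # Counter-driven iteration: one stack frame, the index just
        # steps through sequentially stored values.
        def sum_iterative(values):
            total = 0
            i = 0
            while i < len(values):
                total += values[i]
                i += 1
            return total

        # The same computation recursively: every call allocates a stack
        # frame that must later be deallocated, costing memory and time.
        def sum_recursive(values, i=0):
            if i == len(values):
                return 0
            return values[i] + sum_recursive(values, i + 1)

        print(sum_iterative([1, 2, 3, 4]))  # 10
        print(sum_recursive([1, 2, 3, 4]))  # 10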

  • How has the evolution of programming methodologies led to the development of new programming languages?

    - As new programming methodologies are developed, they introduce new ways of thinking about software development. This evolution necessitates the creation of new programming languages that support these paradigms, leading to the development of languages that better accommodate the methodologies' requirements and features.

  • What are the three main programming language paradigms discussed in the script?

    - The three main programming language paradigms discussed are imperative programming, functional programming, and logic programming. Each paradigm has its own approach to computation and problem-solving, catering to different aspects of software development.

  • How does functional programming differ from imperative programming?

    - Functional programming differs from imperative programming in that it relies on the application of mathematical functions to parameters, without the use of variables, assignments, or iteration. It focuses on pure functions and avoids side effects, leading to a different programming style that is more declarative in nature.
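
    To make the contrast concrete, here is a small functional-style sketch. The lecture's functional examples are Lisp and Scheme; Python is used here purely for illustration, and the function names are made up:

        # Computation proceeds by applying functions to parameters:
        # nothing is reassigned and no loop counter is mutated.
        from functools import reduce

        def square(x):
            return x * x

        def sum_of_squares(xs):
            # map applies a function to every element; reduce combines
            # the results with a combining function.
            return reduce(lambda a, b: a + b, map(square, xs), 0)

        print(sum_of_squares([1, 2, 3, 4]))  # 30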

  • What is logic programming and how does it operate?

    - Logic programming is a paradigm where programs are expressed as a set of facts and rules. The programming language uses an inference engine to reason about these facts and rules to answer queries. It is a declarative approach where the focus is on the relationships between data and the logic used to derive conclusions.
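
    The sketch below is a deliberately tiny imitation of this idea in Python (a real logic language such as Prolog provides a far more general inference engine; the facts and the rule here are invented for illustration):

        # Facts: tom is a parent of bob; bob is a parent of ann.
        facts = {("parent", "tom", "bob"), ("parent", "bob", "ann")}

        # Rule: if X is a parent of Y and Y is a parent of Z,
        # then X is a grandparent of Z.
        def infer(facts):
            derived = set(facts)
            parents = [f for f in facts if f[0] == "parent"]
            for (_, x, y) in parents:
                for (_, y2, z) in parents:
                    if y == y2:
                        derived.add(("grandparent", x, z))
            return derived

        # Query: can ("grandparent", "tom", "ann") be proven?
        print(("grandparent", "tom", "ann") in infer(facts))  # True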

  • What are the three main implementation methods for high-level programming languages?

    - The three main implementation methods for high-level programming languages are compilation, pure interpretation, and hybrid implementation systems. Each method has its own advantages and disadvantages in terms of execution speed, portability, and ease of use.

  • Why is pure interpretation less efficient than compilation?

    - Pure interpretation is less efficient than compilation because it involves decoding high-level statements at runtime, which is more time-consuming than executing pre-translated machine code. Additionally, the interpreter has to handle the full source program and symbol table during execution, leading to higher memory and storage requirements.
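
    The following minimal interpreter sketch (in Python, for a made-up four-instruction mini-language) shows where that cost comes from: the if/elif decoding work is repeated on every pass through the interpreted program's loop, whereas a compiler would translate the loop body to machine code once:

        def interpret(program, env):
            pc = 0
            while pc < len(program):
                op, *args = program[pc]        # decoded at run time,
                if op == "set":                # on every single pass
                    env[args[0]] = args[1]
                elif op == "addc":             # add a constant
                    env[args[0]] += args[1]
                elif op == "addv":             # add another variable
                    env[args[0]] += env[args[1]]
                elif op == "jump_if_less":
                    if env[args[0]] < args[1]:
                        pc = args[2]
                        continue
                pc += 1
            return env

        # total = 0; i = 0; while i < 5: total += i; i += 1
        prog = [
            ("set", "total", 0),
            ("set", "i", 0),
            ("addv", "total", "i"),
            ("addc", "i", 1),
            ("jump_if_less", "i", 5, 2),
        ]
        print(interpret(prog, {})["total"])  # 10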

  • How does a hybrid implementation system, like the Java Virtual Machine, improve upon pure interpretation?

    - A hybrid implementation system, such as the Java Virtual Machine, improves upon pure interpretation by translating the high-level code into an intermediate bytecode, which is then interpreted by a virtual machine. This approach allows for better error reporting and potentially faster execution than pure interpretation, while still maintaining portability across different platforms.
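
    As a concrete aside (not taken from the lecture): CPython, the standard Python implementation, uses the same hybrid scheme, compiling source to bytecode that its virtual machine then interprets, much as javac produces class files for the JVM. The intermediate code is easy to inspect:

        # The standard-library 'dis' module disassembles the bytecode
        # that CPython compiled a function into.
        import dis

        def add(a, b):
            return a + b

        dis.dis(add)   # prints opcodes such as LOAD_FAST and a return op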

  • What is the role of programming environments in supporting software development?

    - Programming environments provide a collection of tools that support various aspects of software development, such as code formatting, debugging, and performance profiling. They can range from simple command-line tools in Unix-like systems to complex visual environments like Microsoft's Visual Studio, which integrate multiple development tools into a unified interface.
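
    As a small illustration (Python, not from the lecture) of the profiling tools such environments bundle, even a bare command-line setup can measure where a program spends its time using the standard-library profiler:

        import cProfile

        def busy():
            return sum(i * i for i in range(100_000))

        # Prints per-function call counts and cumulative times.
        cProfile.run("busy()")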

Outlines

00:00

📚 Influences on Programming Language Design

This paragraph delves into the foundational aspects of programming languages, focusing on the factors influencing their design. It emphasizes the impact of computer architecture and programming methodologies on language features and problem-solving approaches. The von Neumann architecture's influence on programming languages is highlighted, illustrating how languages have evolved alongside changes in computing hardware and software development methodologies.

05:02

🖥️ The Von Neumann Architecture and Imperative Programming

The second paragraph explores the relationship between the von Neumann architecture and imperative programming languages, which dominate the field due to their close alignment with this architecture. It explains how features like variables, assignments, and iteration are abstract models of the von Neumann computer's memory and CPU interaction. The paragraph also discusses the inefficiency of recursion in this context and the preference for iterative structures.

10:04

🔧 Evolution of Programming Methodologies and Languages

This section traces the historical development of programming languages in relation to evolving programming methodologies. It starts from the simplistic languages of the 1950s, designed for single-task scientific applications, to the rise of programmer efficiency in the 1960s, leading to more readable languages and structured programming methodologies. The paragraph also covers the shift towards data-oriented computation in the 1970s and the emergence of object-oriented programming in the 1980s, reflecting the ongoing adaptation of languages to support new methodologies.

15:04

🏭 Programming Language Paradigms and Categories

The fourth paragraph introduces the concept of programming language paradigms, categorizing them into imperative, functional, and logic programming. It discusses the prevalence of imperative programming languages and their roots in the imperative model, including object-oriented, scripting, and visual languages. The paragraph also provides a brief overview of functional programming, characterized by the application of mathematical functions, and logic programming, which relies on facts and rules for reasoning.

20:06

🔄 Implementation Methods for High-Level Programming Languages

This paragraph examines the mechanisms for translating high-level programming languages into executable code, discussing compilation, interpretation, and hybrid systems. It explains that compilation involves translating a program into machine language for efficient execution, while interpretation involves real-time translation by an interpreter, which is less efficient but offers better error reporting. Hybrid systems, exemplified by Java, balance these approaches for small to medium-sized applications.

25:06

🛠️ Compilation Process and Its Phases

The focus of this paragraph is the detailed process of compilation, which includes lexical analysis, syntax analysis, semantic analysis, and code generation. It describes how a source program is broken down into lexical units, transformed into a parse tree, and then into intermediate code, which is finally converted into machine code. The paragraph also mentions the role of a symbol table in maintaining information throughout the compilation process.
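
A minimal sketch of the first of these phases, lexical analysis, is shown below in Python (the token set is invented for illustration); it groups raw characters into the lexical units the syntax analyzer would then consume:

    import re

    # Each class of lexical unit is described by a regular expression.
    TOKEN_SPEC = [
        ("NUMBER",     r"\d+"),
        ("IDENTIFIER", r"[A-Za-z_]\w*"),
        ("OPERATOR",   r"[+\-*/=]"),
        ("SKIP",       r"\s+"),           # whitespace is discarded
    ]
    TOKEN_RE = re.compile("|".join(f"(?P<{n}>{p})" for n, p in TOKEN_SPEC))

    def tokenize(source):
        for match in TOKEN_RE.finditer(source):
            if match.lastgroup != "SKIP":
                yield (match.lastgroup, match.group())

    print(list(tokenize("count = count + 1")))
    # [('IDENTIFIER', 'count'), ('OPERATOR', '='),
    #  ('IDENTIFIER', 'count'), ('OPERATOR', '+'), ('NUMBER', '1')]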

30:08

🔍 Advantages and Disadvantages of Pure Interpretation

The sixth paragraph discusses pure interpretation as an implementation method, highlighting its advantages, such as better error reporting and portability, and disadvantages, including slower execution and higher memory requirements. It explains that interpretation is suitable for small programs and situations where efficiency is not the primary concern, and notes the resurgence of pure interpretation in web scripting languages.

35:09

🔌 Hybrid Implementation Systems and JIT Compilation

This section introduces hybrid implementation systems as a compromise between full compilation and pure interpretation, often resulting in better error reporting and faster execution. It describes how intermediate code, such as Java bytecode, is interpreted and how Just-In-Time (JIT) systems compile sub-programs into machine code for improved subsequent execution speeds, making them a delayed form of compilation.

40:12

🛠️ Tools and Environments for Programming

The final paragraph discusses programming environments as a collection of tools that support software development in a given programming language. It contrasts the primitive tool provision in Unix-like operating systems with the integrated, complex visual environments found in modern development platforms like Microsoft's Visual Studio .NET and NetBeans, which cater to the diverse needs of software developers.

45:12

🚀 Conclusion of Chapter One and Preview of Chapter Two

The concluding paragraph of the chapter summarizes the fundamentals covered in the course so far and provides a preview of the next chapter, which will explore the history and evolution of major programming languages, setting the stage for a deeper understanding of their development and impact on the field.

Keywords

💡Programming Language Design

Programming Language Design refers to the process of creating a new programming language or modifying an existing one. It is a critical concept in the script as it sets the stage for understanding the various influences and methodologies that shape the development of programming languages. The script discusses how factors like computer architecture and programming methodologies affect the features and problem-solving approaches of a language, as exemplified by the prevalence of imperative programming languages closely modeling the von Neumann architecture.

💡Influences

In the context of the video, 'influences' pertain to the real-world factors that shape the nature of programming languages. The script identifies computer architecture and programming methodologies as two main influences, highlighting how they dictate the features and capabilities of languages. For instance, the von Neumann architecture's impact on the design of imperative programming languages is a direct result of its prevalence in computing hardware.

💡Programming Language Paradigms

Programming Language Paradigms are distinct approaches to writing software, each with its own methodology and philosophy. The script outlines imperative, functional, and logic programming as the main paradigms to be discussed. These paradigms are central to the video's theme as they represent different ways of thinking about and solving problems in software development, with imperative programming being the most prevalent.

💡Compilation

Compilation is the process of translating code written in a high-level programming language into machine code that a computer can execute. The script explains that this method is typically used for large-scale applications where efficiency is crucial. It contrasts with interpretation, highlighting the trade-offs between the speed of execution and the ease of development and debugging.

💡Von Neumann Architecture

The von Neumann architecture is a model of computer organization in which programs and data are stored together in a single memory that is separate from the CPU and connected to it by a bus over which instructions and data travel. The script emphasizes its significance as the predominant architecture influencing the design of imperative programming languages, which often model the von Neumann machine's use of variables, assignments, and iteration.
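
To make the definition concrete, here is a toy von Neumann machine sketched in Python (the instruction set is invented): instructions and data sit in the same memory, and the CPU loop repeatedly fetches, decodes, and executes:

    memory = [
        ("load", 5),    # address 0: load the value at address 5
        ("add", 6),     # address 1: add the value at address 6
        ("store", 7),   # address 2: store the accumulator at address 7
        ("halt",),      # address 3: stop
        None,           # address 4: unused
        40,             # address 5: data
        2,              # address 6: data
        0,              # address 7: the result is written here
    ]

    acc, pc = 0, 0                      # accumulator and program counter
    while True:
        op, *operand = memory[pc]       # fetch and decode from shared memory
        pc += 1
        if op == "load":
            acc = memory[operand[0]]    # data piped memory -> CPU
        elif op == "add":
            acc += memory[operand[0]]
        elif op == "store":
            memory[operand[0]] = acc    # result piped CPU -> memory
        elif op == "halt":
            break

    print(memory[7])  # 42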

💡Imperative Programming

Imperative Programming is a paradigm that uses statements to change a program's state. The script describes it as the most popular type of programming, with languages like C, C++, and Java falling under this category. It is closely tied to the video's theme as it exemplifies how programming languages are designed to work with the prevalent computer architecture of the time, namely the von Neumann architecture.

💡Functional Programming

Functional Programming is a paradigm that treats computation as the evaluation of mathematical functions and avoids changing state and mutable data. The script contrasts this with imperative programming, noting that functional languages like Lisp and Scheme rely on the application of functions to parameters, without the concept of variables or assignments, thus offering a different approach to problem-solving in software development.

💡Logic Programming

Logic Programming is a type of programming where the programmer defines facts and rules, and the programming language uses an inference engine to prove the truth of statements. The script introduces Prolog as an example and positions logic programming as a distinct paradigm from imperative and functional programming, highlighting its unique approach to problem-solving based on reasoning and facts.

💡Hybrid Implementation Systems

Hybrid Implementation Systems, as discussed in the script, are a compromise between full compilers and pure interpreters. They often involve translating high-level code into an intermediate language, which is then interpreted or compiled further. The script uses Java and its use of bytecode as an example, illustrating how hybrid systems balance portability and execution speed, which is relevant to the video's exploration of language implementation methods.

💡Just-In-Time (JIT) Systems

Just-In-Time Systems, or JIT systems, are a specific type of hybrid implementation system mentioned in the script. They compile parts of the program into machine code only when they are needed, caching the compiled code for faster subsequent executions. The script explains that this approach is used in modern Java implementations and .NET languages, demonstrating a practical application of the concept of hybrid systems in optimizing program execution.
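
A loose analogy in Python (the helper below is made up, and this is not how the JVM's JIT works internally): an expression is compiled with the built-in compile() the first time it is needed, and the cached code object makes every later call cheaper:

    _code_cache = {}

    def run(expr, **variables):
        code = _code_cache.get(expr)
        if code is None:                          # first call: compile now
            code = compile(expr, "<expr>", "eval")
            _code_cache[expr] = code              # cache for later calls
        return eval(code, {}, variables)

    print(run("x * x + 1", x=3))  # 10 (compiled on this first call)
    print(run("x * x + 1", x=4))  # 17 (reuses the cached code object)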

Highlights

Influences on programming language design include computer architecture and programming methodologies.

Programming languages are developed to work with the prevalent computer architecture at the time of their creation.

Almost all modern computers follow the von Neumann architecture, which significantly influences programming languages.

Imperative programming languages model the von Neumann architecture closely, using variables and assignments to represent memory cells and data piping.

Iterative structures like while loops and for loops are efficient in imperative languages due to the sequential storage of values in memory.

Recursion is discouraged in imperative programming languages due to high memory consumption and slow execution.

The evolution of programming languages is tied to the evolution of software development methodologies.

In the 1950s and 1960s, programming languages were simplistic, focusing on machine efficiency and ad hoc development.

The rise of programmer efficiency in the late 1960s led to more readable languages and structured programming methodologies.

Data abstraction and sophisticated data structures emerged in the late 1970s, focusing on efficient data representation and processing.

Object-oriented programming, introduced in the 1980s, combined data abstraction with inheritance and polymorphism.

Modern programming languages support component-based development, as seen in .NET programming languages.

Language categories or paradigms include imperative, functional, and logic programming, with imperative being the most prevalent.

Functional programming relies on mathematical functions and does not support variables or assignments.

Logic programming is distinct, using facts and rules for reasoning and proving statements through an inference engine.

Hybrid implementation systems, like Java's, balance compilation and interpretation for portability and efficiency.

Just-In-Time (JIT) systems compile sub-programs into machine code for faster subsequent calls, acting as a delayed compiler.

Programming environments encompass tools like debuggers and profilers that support software development.

Unix-like operating systems provide programming tools through terminal-based programs, while modern environments offer integrated visual tools.

Transcripts

00:00

This is the second part of chapter one, where we will continue our discussion on some preliminaries related to programming language concepts. These are the topics we'll be discussing in this lecture. We'll start off with a discussion on various influences on programming language design; in other words, what factors are there that affect a programming language and make the programming language what it is, in terms of what features the language supports and how it approaches solving certain problems that all programming languages must solve. Then we'll be moving on to language categories, also referred to as programming language paradigms. So far you will primarily only have experience with one of those paradigms, possibly two, but we'll be outlining very briefly what the other paradigms are, which we'll be discussing later on in this course. Next we'll be looking at implementation methods; here we are discussing how a system moves from a program written in a high-level programming language to something that can actually execute. So far you should only have had exposure to one of these approaches, namely compilation, but there are other approaches that a programming language can use as well. And then finally we'll be looking very briefly at programming environments, where we're talking about the set of tools that surround a programming language to allow you to do things such as debugging, for example, or memory profiling.

01:39

When we discuss influences on programming language design, we're talking about factors in the real world that affect the nature of a programming language. We can identify two main contributing influences on language design: firstly there is computer architecture, and secondly there are programming methodologies. We'll be discussing each of these in more detail in the coming slides, but to begin with, if we look at computer architecture, we see that programming languages are developed so that they can work with a particular prevalent computer architecture that exists at the time. In other words, whenever a high-level programming language is developed, it is developed with the computing hardware in mind that the programming language needs to work with: programs written in that high-level programming language must execute on the computer architecture that is most widely used at the time. As you should recall from first-year computer science, in COS 151, all modern computers, or almost all modern computers, follow what's referred to as the von Neumann architecture. What this means is that the von Neumann architecture is a major influence on the nature of programming languages, both when programming languages were first evolving, right through until the present day with new languages that are being developed.

Secondly, we have programming methodologies. When we're talking about a programming methodology, we're talking about a software development methodology; in other words, the process that one goes through in order to develop software. Examples of these methodologies might be the waterfall model or object-oriented software development, and you will learn about a number of these in the third-year project subject, COS 301. Whenever a new software development methodology is developed, it represents a new way of thinking about developing a program, which means we have a new programming paradigm that evolves, which in turn means that we need to develop new programming languages that support that paradigm. So as the thinking surrounding the development of software projects evolves, so too do programming languages evolve, in order to support those methodologies.

04:25

Let's focus for a moment on the influence that computer architecture has on the nature of a programming language. We've seen that the most prevalent computer architecture these days is the von Neumann architecture. To recap what we know about it: all computers that follow this architecture store both data and programs in a single shared memory, and this is the core concept that was introduced by John von Neumann, after whom the architecture is named. We also see that the shared memory is separated from the CPU, where the CPU is responsible for actually executing instructions, and because of this separation, instructions and data must be piped from the shared memory of the computer to the CPU for processing. In a similar fashion, the results of the processing must be piped from the CPU back to the computer's memory.

Now let's look at imperative programming languages, which are far and away the most popular programming languages these days. An imperative programming language is the kind of programming language that you will be familiar with so far through your studies: languages like C++ and Java. What we see is that imperative programming languages very closely model the architecture of a von Neumann computer. Imperative programming languages rely very heavily on the notion of a variable, which allows us to store a value with a name associated with it; a variable is actually an abstract model of a memory cell, which we see in a von Neumann computer. In a similar fashion, imperative programming languages also rely very heavily on assignments, because we need some mechanism for placing a value in a variable. Assignment statements, then, are abstract models of the piping mechanism between the computer's memory and its CPU. For example, if we assign the result of a computation to a variable, what we are actually modeling is piping the result of a computation, which has been produced by the CPU, from the CPU to memory for storage.

We also see within imperative languages that iteration is very efficient, while recursion is discouraged. Iteration is efficient because we have values that are stored sequentially within memory, and therefore iteration can be used very simply, by incrementing a counter, to allow us to move from one memory location to another and so access values that are stored in sequence within the shared memory. Recursion, on the other hand, is discouraged because every time we perform a recursive call we need to allocate a stack frame, and then deallocate that stack frame when the recursive call terminates. What this means is that recursion consumes a lot of resources in terms of memory, and it is also a very slow approach to achieving repeated executions of a set of computations. So imperative programming languages rely very heavily on iterative structures such as while loops and for loops. This very close modeling of the von Neumann architecture by imperative programming languages is actually the core, most important reason why imperative programming languages are so dominant these days: because imperative languages model the von Neumann architecture, which is the most prevalent architecture today, it stands to reason that these programming languages should also be the most prevalent.

Here we have a diagram representing the von Neumann architecture, just to illustrate what I discussed on the previous slide. We have our shared memory over here, which stores instructions and data; we have our CPU over here, which performs processing; and then we have the pipeline that connects the memory and the CPU. Variables are simulations of locations within memory, and assignments are simulations of this pipelining, moving instructions and data back and forth between memory and the CPU.

09:21

Now that we've discussed the influence that computer architecture has on the nature of programming languages, we can move on to the influence that programming methodologies have. What we see is that, over time, more and more sophisticated programming methodologies were used within the computing industry, which resulted in more and more sophisticated programming languages being developed in order to support these methodologies. If we look at the dawn of high-level programming languages, in the 1950s and the early 1960s, we see that the applications that were being developed were very simple: typically one was looking at primarily scientific applications that were designed for one programmer's use, in other words not more widely used than one single person, and the programs were also designed for one very clear, very specific task. What this meant was that programmers were not really using very sophisticated programming methodologies: programs were mostly being developed on an ad hoc basis, and programmers were much more worried about machine efficiency. The result was that the programming languages from this era were fairly simplistic; they didn't have a lot of features that would make them easier to use for programmers, and the control structures and other features included within programming languages of this time were modeled very closely on how the computing systems of the day actually operated.

Moving on a little, into the late 1960s, we see the rise of the idea that people efficiency, in other words the efficiency of the programmers who are actually writing programs, was becoming more and more important, actually overshadowing machine efficiency. We then see in that era an increase in the readability of programming languages, as well as the introduction of better, more sensible control structures. Hand in hand with this, we also see the evolution of the structured programming methodology, which led to the ideas of top-down design and stepwise refinement. What this resulted in, directly, was the introduction of the first abstraction mechanisms in high-level programming languages: things like functions and modules were introduced within programming languages in the late 1960s, and this allowed programmers to structure their programs in a more sensible fashion. Programs would be easier to write as well as debug, and this ultimately resulted in the more efficient writing of programs by the programmers themselves.

Then, in the late 1970s, we see a movement from process-oriented computation to data-oriented computation: less of a focus on how the results were actually computed, and more of a focus on how those results were represented. So what we see in programming languages in the late 1970s is a focus on data abstraction: more sophisticated data structures, more ways to store and represent data and to process that data in a more efficient fashion. We see the introduction of concepts like linked list structures, for example, in this time. Then, in the middle 1980s, we see the rise of object-oriented programming. This was an extension of what happened in the 1970s, with even more of a focus on the data representation aspect of programs, but also associating behavior with these data structures. Programming languages from this era, starting with the programming language Smalltalk, introduced the concepts of data abstraction as well as inheritance and polymorphism, and these ideas together combine to give us the notion of object-oriented programming. Now, this evolution is still continuing today. What we see in more modern times is the rise of component-based development, and this has led to even more sophisticated programming languages: languages like the .NET programming languages, which allow for component-based development. We won't be focusing on that too much within this course, but we will get to some more detail on C#, which is one of the .NET programming languages.

14:31

We'll now look at the different language categories that we will be discussing through the remainder of this course. Language categories are fairly often referred to as programming language paradigms. Our discussion will focus on three main paradigms: imperative programming, functional programming, and logic programming; we won't be focusing in much detail on markup and programming language hybrids. The main focus of our discussion throughout this course will be on imperative programming languages, and this is because the vast majority of programming languages are imperative in nature; I would estimate that probably about 80 percent of high-level programming languages out there today could be classified as imperative programming languages. We already touched on imperative programming languages when we were discussing the influence that computer architecture has on the nature of programming languages, and there we saw that imperative programming languages focus on support for variables as well as assignment statements, and they also offer support for efficient iteration by means of structures such as while loops and for loops. Our discussion will also consider object-oriented programming languages to fall under the category of imperative languages, and the same is true for scripting languages and visual languages. Some texts consider OO, scripting, and visual languages to be separate paradigms; however, the textbook that we're using places these all under the umbrella of imperative programming languages, and the reason for this is that object-oriented languages, scripting languages, and visual languages all have their roots within the imperative model. So if we are considering examples of imperative programming languages, we're basically talking about languages that you, up to this point, should be familiar with: languages like C, C++, and Java, also scripting languages such as Perl and JavaScript, and then visual languages such as the .NET programming languages.

The next category or paradigm that we'll be looking at is functional programming. Functional programming is not very different from imperative programming; however, the philosophy of functional programming is fairly different. All computation within a functional programming language is done by means of applying mathematical functions to parameters. Also, what is core to pure functional programming languages is that they don't support variables at all, which means of course that they can't support assignments, and they also can't support iteration, because iteration typically relies on loop control variables. So in that sense functional languages operate in a very different fashion to imperative programming languages; however, the notion of a function is something that you will be familiar with from languages like C++ and Java. Some examples of functional programming languages include Lisp, the very first functional programming language; Scheme, which is a dialect of Lisp that we'll be looking at in some detail a bit later on in this course; and then ML and F#. Now, what we see in terms of functional programming languages is that they are definitely not as widely represented as imperative programming languages in terms of the languages that are out there today; however, functional programming is becoming much more popular these days, and there are a number of companies out there that are interested in hiring people familiar with functional programming languages, so this is definitely a paradigm to keep an eye on.

The third category or paradigm that we'll be focusing on is logic programming, and this is completely unlike imperative programming and functional programming. Within a logic programming language such as Prolog, which we'll be discussing later on in this course, we define facts, and we define rules that can be used to reason about these facts; these rules are context independent, in other words they are specified in no particular order. We can then issue queries to our logic programming language, essentially asking the language whether it can prove the truth of a particular statement. The logic language will then use a system that it maintains behind the scenes, referred to as an inference engine, and it will reason based on the facts and rules that you've provided to the language and try to prove whether the query that has been provided to it is true or not.

And then lastly we have hybrid markup and programming language systems, and these don't really classify as fully featured programming languages in their own right. Essentially, markup languages are typically used to specify the structure of a document; we're talking about things like HTML and XML here. Some of these markup languages have over time been extended in order to support some programming concepts, such as simple selection statements and possibly loops, and this allows for a more sophisticated specification of what a document, for example a web document, needs to look like, and it allows specification of dynamic structure within these documents. Examples of these hybrid languages are JSTL and XSLT. As I've said, we won't really be focusing on these in any further detail later on in the course, other than this brief mention of these approaches.

20:49

Next we'll look at implementation methods for high-level programming languages. What we are talking about here is the mechanism that is used to go from a program written in a high-level language, which obviously can't be interpreted by a computer, to something that actually can be executed by a computer. Of the three main approaches, the one that you will be most familiar with at this stage is compilation. What we're talking about here is a situation where programs written in a high-level language are translated down into machine language, and this machine language then forms an executable which can actually be run on its own. At the point where we have an executable, we can essentially discard the program written in the high-level programming language, because it is not required for the execution of the machine-level instructions. The use of compilation, then, is typically for very large commercial applications, usually where efficiency is very important.

The second approach is referred to as pure interpretation. In this case there isn't a complete translation of a program from a high-level representation down into machine language; instead, high-level programs are interpreted by another program, which is referred to as an interpreter. In other words, the translation of the high-level program down into something that can actually be executed happens at run time, by means of the interpreter. What this means is that pure interpretation is usually not quite as efficient as compilation. The primary use for pure interpretation is for small programs, because such programs usually execute fairly quickly anyway, and therefore we don't have to go through all of the trouble of performing a full compilation, but also in application areas where efficiency is not at all an issue.

And then in the third place we have hybrid implementation systems, and these systems came to the fore with the rise of the Java programming language. In hybrid implementation systems we see a compromise between compilers and pure interpreters: these hybrid systems try to strike a balance between the two and leverage the advantages of both compilation and pure interpretation. The use of hybrid implementation systems is typically for small to medium-sized systems, where efficiency is important but is a secondary concern; other concerns, such as the portability of the program code, might be more important than efficiency. In the next few slides we'll be looking at each of these implementation methods in some more detail.

23:59

Now, the implementation method for a high-level programming language can be incorporated into a layered view of a computer system. This holds for any implementation system, whether it is a full compilation process, pure interpretation, or some sort of hybrid system. In this layered view we have the basic machine, which executes machine-level instructions, right at the core; surrounding that we have a macroinstruction interpreter, and then an operating system that passes instructions through to the macroinstruction interpreter. Importantly, however, on top of the operating system we then have a collection of compilers, interpreters, and hybrid systems. For example, we might have a C compiler, we might have an interpreter for Lisp, and we might have some sort of virtual machine for a language like Java. Each of these compilers, interpreters, or hybrid systems then essentially forms a kind of virtual computer. In the case of a C compiler, the virtual computer that we have constructed is a computer that runs on C code, which is then translated down by means of the compiler to something that can eventually be executed on the bare machine right at the core of the model. In a similar fashion, a pure interpreter such as the Lisp interpreter is a virtual computer that executes Lisp commands, which are then translated down into something that can be executed by the machine, and the same is also true for a hybrid system such as the Java Virtual Machine.

26:02

We'll now look at each of the three implementation methods in more detail, starting off with compilation. When we talk about compilation, we're talking about the full, complete translation of a high-level program, written in a source language such as C, C++, or Java, down into machine code, written in a machine language appropriate for the specific machine architecture that the program must run on. Now, what we see with compilation is that the translation process is very slow, and there are a few reasons for this. Firstly, the full program needs to be translated, not just the portion of the program that is currently executing, so the full translation process will take a lot of time, because there is a lot of code that needs to be processed. Additionally, there are several phases that the translation process has to go through, which is relatively time consuming. And finally, there may also be a number of optimization steps that need to be performed in order to produce efficient machine code as the final result of the compilation process. So we have very slow translation, but in return for the slow translation we get very fast execution, and this is because the final result of the compilation process is machine code, which is very close to the low-level machine representation and therefore executes very efficiently.

Now, the compilation process, as I mentioned, goes through several phases. The first phase is lexical analysis. In this phase the characters within the source program are converted into lexical units, and lexical units are the components that a program is built up out of: things like identifiers for variable names would be lexical units, and things like operators, for example plus and minus symbols denoting addition and subtraction operations, would also be lexical units. In the next phase these lexical units are taken and transformed into a parse tree, where the parse tree represents the syntactic structure of the program in the high-level programming language source. At this point we've only analyzed the syntax of our program; we haven't actually looked at the semantic meaning of the program. That is done in the next phase, semantic analysis, where the parse tree is processed in order to generate an intermediate code representation. This intermediate code representation can't yet be executed by the machine, so the final phase, code generation, runs through the intermediate code and converts it into machine code that can actually be executed by the computer itself.

This diagram very simply illustrates the compilation process that we've just discussed. We start off with our source program, written in a high-level programming language like C or C++. We then go on to the lexical analyzer, which produces lexical units as results. The syntax analyzer then performs the next step, taking the lexical units and generating a parse tree from those units. The parse tree is then moved on to the intermediate code generator, and potentially there may be a number of optimizations that need to be performed there; the result of that is the intermediate code. The intermediate code is then passed on to the code generator, which generates machine language code, which can then finally be executed by the computer, with, of course, the addition of input data in most cases, and then we finally have results that are produced at the end of this process. Throughout the compilation process we also have a symbol table that is maintained. The details on the specifics of these various phases, however, we will leave out for the remainder of this discussion, because this is a compiler construction related issue and not the focus of this course.

play30:42

we'll now look at the second main

play30:45

implementation method

play30:46

namely pure interpretation now with pure

play30:49

interpretation there's no translation

play30:51

process

play30:52

that converts our high level program

play30:54

down into machine instructions

play30:57

instead we have a program referred to as

play30:59

an interpreter that

play31:00

directly executes the source program

play31:04

now the main advantage of peer

play31:05

interpretation is that it is much

play31:08

easier for programmers to use and this

play31:10

is primarily

play31:11

because the error reporting is a lot

play31:14

better

play31:15

so in the case of a compiled programming

play31:18

language such as c

play31:20

plus you should recall that any runtime

play31:23

error such as a memory fault

play31:25

simply causes the program to terminate

play31:28

and there is no error reporting that

play31:30

refers

play31:31

to a specific line within the source

play31:34

program where the error occurred however

play31:37

in the case of pure interpretation we're

play31:39

directly executing the source program

play31:42

code

play31:42

so if an error occurs we know exactly

play31:45

which line

play31:46

caused that error and that can then be

play31:48

reported to the programmer

play31:50

and this of course then speeds along

play31:53

the debugging process and makes it a lot

play31:56

easier for programmers

Another advantage associated with pure interpretation is that purely interpreted programming languages are usually much more portable than compiled programming languages. With a compiled programming language, the translation has occurred for a specific machine architecture, and usually that executable code can't simply be ported to a different system with different hardware and possibly a different operating system. In the case of interpretation, the source program usually does not need to change for different architectures; it's only the interpreter that needs to be changed for the specific computing platform on which the program will be executed.

Now, the main disadvantage of pure interpretation is much slower execution: purely interpreted programming languages are usually between 10 and 100 times slower than compiled programming languages. One of the reasons for this is that, instead of decoding machine language instructions, we are now decoding high-level statements. Decoding machine language instructions is very efficient, because it essentially happens at the hardware level; decoding high-level statements, however, is usually a much more involved process and therefore takes more time. Another reason for the slower execution comes into play with loop structures. In the case of a compiled programming language, the loop body is translated once, and for each iteration of the loop that translated body is simply jumped to and executed. In the case of a purely interpreted programming language, however, the body of the loop has to be decoded for each individual iteration of the loop structure, and this obviously slows things down, because many more decoding steps need to be performed.
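You can imitate this overhead in Python, which exposes its own compile step. In the sketch below (re-parsing a source string stands in for an interpreter re-decoding a statement; the expression and iteration counts are arbitrary), the same loop body is either re-decoded on every iteration or translated once and then merely re-executed.

```python
import timeit

stmt = "x * x + 1"

# Pure-interpretation style: decode the statement on every loop iteration.
def interpreted(n=10_000):
    total = 0
    for x in range(n):
        total += eval(stmt)  # re-parses the source string each time
    return total

# Compiled style: translate the loop body once, then just execute it.
code = compile(stmt, "<loop-body>", "eval")
def compiled(n=10_000):
    total = 0
    for x in range(n):
        total += eval(code)  # no decoding step, only execution
    return total

print(timeit.timeit(interpreted, number=10))
print(timeit.timeit(compiled, number=10))  # typically several times faster
```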

Purely interpreted programming languages also very often require a lot more space, in terms of memory but sometimes also hard drive storage space. This is because the full source program needs to be available during the execution of the program, and the full symbol table also needs to be present at runtime.

What we see, then, is that pure interpretation these days is relatively rare amongst traditional high-level programming languages. We will see in Chapter 2 that pure interpretation was actually a much earlier approach to program execution than full compilation. More recently, however, there has been a comeback of pure interpretation in the context of web scripting languages such as JavaScript and PHP. So what I'd like you to do at this point is pause the video and consider why pure interpretation is appropriate for web-based scripting languages specifically.

This diagram very simply illustrates the interpretation process. We have our source program, and that source program is fed into our interpreter. The interpreter also receives input data from the user, or possibly from input files or some other source, and the interpreter then directly executes the source program and produces results by means of that execution.

The last implementation method is the hybrid implementation system. Hybrid implementation systems take various forms, but in general they are a compromise between a full compiler and a pure interpreter. Typically, a hybrid implementation system begins with a program written in a high-level language, and this code is translated down to an intermediate language. The intermediate language is then interpreted, and it is designed to allow for easy and efficient interpretation. In general, then, hybrid implementation systems incorporate the best of both worlds: they very often have much better error reporting than a fully compiled language, and they are much faster than pure interpretation usually is.
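As a minimal sketch of the idea (a toy example with an invented three-instruction intermediate language, reusing the parse-tree shape from the earlier front-end sketch), the "compiler" below translates an expression tree once into a flat list of stack instructions, and a separate dispatch loop then interprets that intermediate code.

```python
# Toy hybrid system: translate source once into a simple stack-based
# intermediate language, then interpret the intermediate code.

def translate(tree):
    """Compile a nested-tuple parse tree into a flat instruction list."""
    if tree[0] == "num":
        return [("PUSH", int(tree[1]))]
    op = {"+": "ADD", "*": "MUL"}[tree[0]]
    return translate(tree[1]) + translate(tree[2]) + [(op, None)]

def interpret(code):
    """A dispatch loop over the intermediate code; easy and fast to decode."""
    stack = []
    for op, arg in code:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

tree = ("+", ("num", "2"), ("*", ("num", "3"), ("num", "4")))
bytecode = translate(tree)  # [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('MUL', None), ('ADD', None)]
print(interpret(bytecode))  # 14
```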

Two examples, then, of hybrid implementation systems. Firstly, we have Perl: Perl programs are partially compiled in order to detect errors before they are interpreted. A much more interesting example is the Java system, where a typical Java system is hybrid in nature. There are some compilers for Java that will translate Java directly into machine code, but the majority of systems are hybrid implementation systems. Here, when we compile a Java program, we produce an intermediate code form referred to as bytecode. If you compile a Java program, it produces a .class file, and that .class file is the intermediate bytecode. This bytecode is portable between different machine architectures: you can use the same bytecode on, for example, a Windows computer, an Apple Mac, or a Linux system. The specific machine architecture must then have a bytecode interpreter, as well as a runtime system, implemented for it. In other words, there is a different bytecode interpreter and runtime system for each individual machine that the Java bytecode needs to be able to execute on, and together the bytecode interpreter and the runtime system are referred to as the Java virtual machine. What this means is that there is only one portability issue from the perspective of the language designers, and that is developing a Java virtual machine for a specific platform. Once that has been done, any program written in Java that has been translated down into bytecode can execute on this virtual machine, and will therefore be portable between different systems.
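Python's reference implementation, CPython, follows the same recipe (source compiled to bytecode that a virtual machine then interprets), so you can inspect a real intermediate code form without a Java toolchain. The standard `dis` module disassembles it; the function here is just a throwaway example, and the exact opcode names vary between CPython versions.

```python
import dis

def area(w, h):
    return w * h

# CPython has already translated this function into bytecode; dis shows
# the intermediate instructions that the Python virtual machine interprets.
dis.dis(area)
# Output varies by version; on older CPython it is roughly:
#   LOAD_FAST  w
#   LOAD_FAST  h
#   BINARY_MULTIPLY
#   RETURN_VALUE
```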

Here is a diagram illustrating a hybrid implementation system. We can see that the first few steps of a full compilation procedure are executed: we have our source program, which is fed into a lexical analyzer; the lexical analyzer produces lexical units, which are passed on to the syntax analyzer; the syntax analyzer builds parse trees out of the lexical units; and the parse trees are sent on to the intermediate code generator, which generates the intermediate code, in the form of something like bytecode in the case of Java. Once we have the intermediate code, the remaining compilation steps are not executed to generate machine-level code. Instead, we simply have an interpreter that executes the intermediate code; it will typically accept some sort of input data, and it finally generates results from the execution of this intermediate code.

A specific and very interesting example of a hybrid implementation system is the just-in-time implementation system, usually referred to as a JIT system. A JIT system will first translate the entire program to an intermediate language, in exactly the way we've just seen for any hybrid implementation system. However, things change in terms of how sub-program calls are dealt with. A sub-program is something like a function, a method, or a procedure in a programming language. Whenever a sub-program is called, the intermediate language for that sub-program is compiled down into machine code. This machine code version is executed for that call to the sub-program, but it is also kept around, essentially cached, for subsequent calls. What this means is that the first call to the sub-program will be slower, because the compilation process has to take place, but every subsequent call will be faster, because we already have the machine code version of that sub-program. In essence, then, a JIT system is a delayed compiler: it only compiles what needs to be compiled, based on the execution of the program. JIT systems are seen in a few cases: a lot of modern Java implementations use JIT systems, and we also see that the .NET languages developed by Microsoft are implemented using JIT systems.
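A rough way to picture this behaviour is a cache of translated sub-programs that is filled on first call. In the sketch below, Python code objects stand in for machine code (plain Python cannot emit native code, so this only mimics the caching, not the native translation), and all names and sources are invented for the example.

```python
# Toy JIT: "machine code" is faked with Python code objects, but the caching
# behaviour is the point -- translate a sub-program on its first call only.
jit_cache = {}

def call_subprogram(name, source, env):
    code = jit_cache.get(name)
    if code is None:
        # First call: pay the compilation cost once...
        code = compile(source, f"<{name}>", "exec")
        jit_cache[name] = code   # ...and cache the result for later calls.
    exec(code, env)              # subsequent calls skip straight to execution

env = {"x": 6, "y": 7}
call_subprogram("mul", "result = x * y", env)  # slow path: compiles, then runs
call_subprogram("mul", "result = x * y", env)  # fast path: cached code reused
print(env["result"])  # 42
```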

Finally, we'll take a quick look at programming environments. When we talk about programming environments, we're talking about the collection of tools that surround a programming language and support the use of that programming language for the development of software. These tools can take a variety of different forms. We might, for example, have automatic code formatters or code beautifiers that can format our code according to a specific standard. We can also have debuggers, and we can have code profilers, which we can use to analyze the memory usage and overall performance of programs we've written while they are executing.
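For instance, here is a minimal profiler run using Python's standard `cProfile` module; the function being profiled is just a placeholder workload.

```python
import cProfile

def busy():
    # placeholder workload to profile
    return sum(i * i for i in range(100_000))

# Reports how often each function was called and how much time it took --
# exactly the kind of performance data a code profiler provides.
cProfile.run("busy()")
```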

Now, there are different ways that these tools can be provided to a programmer, and a very early example of this is the Unix operating system. When I say the Unix operating system, I mean the Unix-like operating systems, which also incorporate Linux. Unix is an older operating system, and therefore its collection of tools is provided in a fairly primitive fashion, by means of separate programs that can be executed within a terminal. For example, under the Linux operating system we have the GNU debugger, gdb, which is a separate program that can be executed within a terminal. Of course, these days modern Unix-like operating systems use visual front-ends such as KDE or GNOME; behind the scenes, however, these visual front-ends still run the lower-level, terminal-based tools provided within the programming environments of Unix-like operating systems.

play44:08

more modern programming environments

play44:10

have very complex visual environments

play44:13

that are

play44:14

very large and very

play44:17

complicated and support the work of a

play44:20

software developer in a lot of different

play44:22

ways

play44:23

so one example of this is microsoft's

play44:26

visual studio.net platform

play44:28

a very complex environment where all of

play44:30

these various tools have been integrated

play44:33

into the visual environment and

play44:36

therefore they are directly

play44:38

accessible to the programmer

play44:41

through the visual environment directly

play44:43

through the editor

play44:45

so this environment then within visual

play44:48

studio can be used to build both

play44:51

web and non-web-based applications in

play44:54

all of the

play44:55

net programming languages and therefore

play44:57

we have support for multiple different

play44:59

languages that can interface

play45:01

all through the same visual environment

play45:05

netbeans is another example of a

play45:08

relatively

play45:08

complex visual environment very much

play45:12

like visual studio.net however it is

play45:14

developed

play45:15

for java web applications

Right, so that concludes our discussion of Chapter 1. We've discussed all of the fundamentals that we will be using throughout the remainder of the chapters covered in this course. In the next lecture we will be moving on to Chapter 2, where we will look at the history of the evolution of the major programming languages.
