Homework: Problem Set #1

The proof of the "sets" problem and a hint on another problem from last class are on the notes page, and Problem Set #1 has been posted on the Assignments Page. #4 is an important example because it defines all arithmetic expressions over addition, subtraction, multiplication, division, unary plus, and unary minus. A context-free grammar defining the syntax of most commonly used programming languages is usually no more complicated than this. #5 will be the most difficult problem of the lot. Dr. Workman does not expect a formal proof, but he does expect you to get it started.

Due: (no due date).

**Other administrative stuff…** We will have the first quiz after we finish the Introduction to Finite State Automata. It won't be next week.

## Review :: Phrase Structured Grammars

The Chomsky Hierarchy of Grammars. You should have a firm understanding of the formal definitions of the different kinds of grammars. **That will be on the quiz.**

G = (N, Σ, P, S)

P = { u → v | u ∈ V*NV* and v ∈ V* }, where V = N ∪ Σ

Note: u ≠ v

Type-0 (Phrase Structured): No restrictions.

Type-1 (Context-Sensitive): Length of the right part is greater than or equal to the length of the left part ( |v| ≥ |u| ) ⇒ No erasing.

Type-2 (Context-Free): The left part is a single non-terminal: u ∈ N. No left/right context; no restrictions on the right part.

Linear (special type of Context-Free): u ∈ N and v ∈ Σ*NΣ* ∪ Σ*

Type-3 (Right-Linear Grammars, also a subset of Context-Free): u ∈ N and v ∈ Σ*N ∪ Σ*. If there is a non-terminal (there can be at most one), it must be in the rightmost position (no symbols following the non-terminal).

Left-Linear (similar to Right-Linear, special type of Context-Free): u ∈ N and v ∈ N Σ* ∪ Σ*

Type 3 and Left-Linear are also called **Regular Grammars**.

LLG ≡ Left-Linear Grammar.

RLG ≡ Right-Linear Grammar.
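As a concrete restatement of these definitions (a sketch of my own, not from the lecture — the helper names are hypothetical, and symbols are assumed to be single characters), predicates that check whether a single production u → v is right-linear or left-linear might look like:

```python
# Hypothetical helpers: classify a production u -> v given the non-terminal set N.
# Symbols are single characters; terminals are any symbols not in `nonterminals`.

def is_right_linear(u, v, nonterminals):
    """u in N and v in Sigma*N or Sigma* (at most one non-terminal, rightmost)."""
    if u not in nonterminals:
        return False
    nts = [i for i, s in enumerate(v) if s in nonterminals]
    return nts == [] or nts == [len(v) - 1]

def is_left_linear(u, v, nonterminals):
    """u in N and v in N Sigma* or Sigma* (at most one non-terminal, leftmost)."""
    if u not in nonterminals:
        return False
    nts = [i for i, s in enumerate(v) if s in nonterminals]
    return nts == [] or nts == [0]
```

A rule satisfying both predicates has its right part in N ∪ Σ*, which is exactly the overlap of the two grammar classes discussed later in these notes.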

**Comment:** A rule u → v where |v| < |u| is called an **erasing rule**. When u and v are both members of N (u, v ∈ N), the rule is called a **unit rule** (e.g. X → Y).

**Example**

G = ( N, Σ, P, S )

N = { S, X, Y }

Σ = { a, b, c }

P = {

1: S → aS

2: S → Xbca

3: X → ccY

4: Y → λ

}

**What type of grammar is this?** Linear. When you mix both left- and right-linear rules you get a linear grammar.
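A quick sanity check on the example (my own sketch, not from the lecture — `RULES` and `derive` are hypothetical names): the helper below applies the numbered rules to the leftmost occurrence of each rule's left part and records every sentential form.

```python
# Rule numbers match the notes: 1: S -> aS, 2: S -> Xbca, 3: X -> ccY, 4: Y -> lambda
RULES = {1: ("S", "aS"), 2: ("S", "Xbca"), 3: ("X", "ccY"), 4: ("Y", "")}

def derive(rule_sequence, start="S"):
    """Apply each rule to the leftmost occurrence of its left part."""
    form = start
    forms = [form]
    for r in rule_sequence:
        left, right = RULES[r]
        form = form.replace(left, right, 1)  # leftmost rewrite
        forms.append(form)
    return forms

# S => aS => aXbca => accYbca => accbca
print(" => ".join(derive([1, 2, 3, 4])))
```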

## New Stuff

**Definition:** Let Δ be a language specification (description) ______. The __family of languages__ defined by Δ is

*F*_{Δ} = { L | L = L (δ) for some δ ∈ Δ }

For instance, Δ = Java; δ = a Java program. Δ = Set theory; δ = { x ∈ Σ* | |x|_{0} = 25 } (the strings containing exactly 25 zeros). Δ = English; δ = "the set of all symbol strings of finite length taken from the alphabet '0' and '1'."

You can also have *F*_{1} inside *F*_{0}. Language L may have an infinite number of descriptions in Δ or in other formal systems. In the case of *F*_{1} inside *F*_{0}, *F*_{1} would be the smallest family containing L. Hmm…

Suppose L ∈ *F*_{Δ}. **TRUE or FALSE:** If A ⊆ L then A ∈ *F*_{Δ}. **False!!** (In general).

The family of languages generated by LLGs is the same as the family generated by RLGs. These are __language families__. As SETS, the __grammars__ overlap (are not equal): there are RLGs that are not LL, there are LLGs that are not RL, and there are some that are both. The relationship between the families of grammars may be different from the relationship between the corresponding families of languages.

What do the grammars look like that belong to both? How would you characterize grammars that are in the inner (overlapping) section?

u ∈ N AND v ∈ N ∪ Σ*

Rules will look like…

X → w, w ∈ Σ*

X → Y, Y ∈ N

What can you say about the family of those grammars?

*F*_{both} = Finite languages over Σ … e.g.:

S → λ

S → aa

S → b

S → cba

All of the regular languages outside *F*_{both} are infinite languages that can be described by either an LLG or an RLG.
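The claim that grammars with only X → w and X → Y rules generate exactly the finite languages can be checked mechanically. A sketch of my own (assuming the rules are given as a dict from non-terminals to lists of right parts): chase unit rules until only terminal strings remain.

```python
# Hypothetical sketch: compute L(G) for a grammar whose rules are all of the
# form X -> w (w a terminal string) or X -> Y (a unit rule).
def language(rules, start="S"):
    """Non-terminals are the keys of `rules`, so a right part that is itself
    a key is a unit rule; anything else is a terminal string."""
    strings, seen, stack = set(), set(), [start]
    while stack:
        x = stack.pop()
        if x in seen:
            continue  # guard against unit-rule cycles like X -> Y, Y -> X
        seen.add(x)
        for v in rules[x]:
            if v in rules:          # unit rule X -> Y
                stack.append(v)
            else:                   # terminal rule X -> w
                strings.add(v)
    return strings

# The grammar above: S -> lambda | aa | b | cba (lambda shown as '')
print(sorted(language({"S": ["", "aa", "b", "cba"]})))
```

The set returned is always finite because each non-terminal is visited at most once and contributes finitely many terminal strings.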

If L ∈ *F*_{Δ} and A ⊆ L, then it is not true, in general, that A ∈ *F*_{Δ}.

Consider Σ*.

A language, L, over Σ is __any__ subset of Σ*.

|*F*_{Σ}| is uncountable

*F*_{Σ} = { L | L ⊆ Σ* } = ℘(Σ*)

The family of __computable languages__ over Σ is just the family of Type-0 languages over Σ. Since there are only countably many grammars, this family is countable.

∴ There are subsets of Σ* that are not computable!

**Theorem:** |N| = |Z| (N = the natural numbers; Z = the rational numbers = { a/b | b > 0 and a ∈ Integers }).

Lemma A. If A ⊆ S, then |A| ≤ |S|.

__Proof__: |A| ≤ |S| iff there exists a one-to-one function, f, from A to S.

Choose f to be the identity function; f(x) = x.

If x ∈ A, then f(x) ∈ A ⊆ S. ∴ f(x) ∈ S.

If x ≠ y, then f(x) ≠ f(y). ∴ f is one-to-one.

Lemma B. |N x N| = |N|, where N x N = { (k, j) | k, j ∈ N }.

To prove this, exhibit f: N x N → N, where f is a one-to-one and onto function.

f(0,0) = 0

f(1,0) = 1 (notice k + j = 1)

f(0,1) = 2 (notice k + j = 1)

f(2,0) = 3 (notice k + j = 2)

f(1,1) = 4 (notice k + j = 2)

f(0,2) = 5 (notice k + j = 2)

f(3,0) = 6 (notice k + j = 3)

f(2,1) = 7 (notice k + j = 3)

f(1,2) = 8 (notice k + j = 3)

f(0,3) = 9 (notice k + j = 3)

.

.

.

f(k,j) = g(k+j) + j, where g(n) = Σ_{i=0}^{n} i = n(n+1)/2. That is, f(k,j) = (k+j)(k+j+1)/2 + j.
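The enumeration above can be reproduced directly (a small sketch of my own, not part of the lecture), walking the diagonals k + j = 0, 1, 2, …:

```python
def f(k, j):
    """Cantor pairing: position of (k, j) in the diagonal enumeration of N x N."""
    return (k + j) * (k + j + 1) // 2 + j

# Reproduce the table from the notes, diagonal by diagonal.
for n in range(4):
    for j in range(n + 1):
        print(f"f({n - j},{j}) = {f(n - j, j)}")  # f(0,0)=0 ... f(0,3)=9
```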

Now we must show that this function is both one-to-one and onto.

**To show f is one-to-one**, we must show f(x) ≠ f(y) whenever x ≠ y.

Suppose (k,j) ≠ (k',j')

Case (a): k+j = k'+j'

f((k,j))-f((k',j')) = ((k+j)(k+j+1)/2 - (k'+j')(k'+j'+1)/2) + j - j' = j - j'

If j - j' = 0, then j = j'. But then k + j = k' + j, so k = k', contradicting our assumption that (k,j) ≠ (k',j'). ∴ j - j' ≠ 0 ⇒ f((k,j)) ≠ f((k',j'))

Case (b): k' + j' > k + j

k' + j' = k + j + a, a > 0

f((k',j')) - f((k,j)) = (k+j+a)(k+j+a+1)/2 - (k+j)(k+j+1)/2 + j' - j = a(k+j) + a(a+1)/2 + j' - j

Since j ≤ k + j ≤ a(k+j), a(a+1)/2 ≥ 1, and j' ≥ 0, this difference is at least 1. If we assume to the contrary that the two function values are equal, we would need j = a(k+j) + a(a+1)/2 + j', which exceeds k + j ≥ j — and that cannot happen! ∴ f((k',j')) ≠ f((k,j)).

**To show f is onto**, let m ≥ 0 be a natural number and find (k,j) such that f((k,j)) = m.

Let n be the largest natural number such that n(n+1)/2 ≤ m. Define j = m - n(n+1)/2 and k = n - j; then k + j = n and f((k,j)) = n(n+1)/2 + j = m.

e.g. m = 29: n = 7 (since 7·8/2 = 28 ≤ 29 < 36), j = 29 - 28 = 1, k = 6, and indeed f((6,1)) = 29.

Basically that's all we need to establish that f is one-to-one and onto. So what we've shown is that Lemma B is true.
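The onto argument is itself an algorithm for inverting f. A sketch (mine, not the lecture's — `f_inverse` is a hypothetical name):

```python
def f(k, j):
    """Cantor pairing function from the notes."""
    return (k + j) * (k + j + 1) // 2 + j

def f_inverse(m):
    """Find (k, j) with f(k, j) = m: take the largest diagonal n with n(n+1)/2 <= m."""
    n = 0
    while (n + 1) * (n + 2) // 2 <= m:
        n += 1
    j = m - n * (n + 1) // 2
    return (n - j, j)

print(f_inverse(29))  # (6, 1), matching the worked example
# Round trip: every m maps back to itself, so f is one-to-one and onto.
assert all(f(*f_inverse(m)) == m for m in range(100))
```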

Third step is to show that |Z| = |N|.

Z = { a/b | a ∈ J (integer) and b > 0 }

You can think of Z as a set of ordered pairs (a,b).

Z ⊆ J x N

By Lemma A, |Z| ≤ |J x N| = |N x N| = |N| (using |J| = |N|). ∴ |Z| ≤ |N|. Also N ⊆ Z (each n is n/1), so |N| ≤ |Z| by Lemma A.

∴ |N| = |Z| *(Can you prove this?)*

We have done something in class to support the final step — that |Z| ≤ |N| and |N| ≤ |Z| together give |N| = |Z|.

The key when you're dealing with infinite sets is coming up with one-to-one, onto mappings. Unless you can give a formula, you don't have a proof!

**Lemma B** is a hint for problem #2.

## Inductive Definitions

**Inductive Definitions**: Used so that properties of the language being defined can be proved using induction. Inductive proofs work particularly well with inductive definitions.

Definition: An **inductive definition** of a set, S, consists of the following:

- Basis Rule (R_{0}): Declaration or enumeration of some finite non-empty subset of S.
- A finite set (possibly empty) of inductive rules (R_{i>0}): Rules of the form **if P(x) then f(x) ⊆ S**, where x denotes some existing member of S at the time the rule is applied, P(x) is some predicate true about x, and f(x) is a finite set of potentially new members of S.
- Completion Rule: a statement of the fact that **nothing belongs to S that cannot be established by applying some finite sequence of the above rules**.

**Example**

Define Σ*.

__Basis Rule__:

λ ∈ Σ*

__Inductive Rules__:

R_{1}: If x ∈ Σ*, then ∀ a ∈ Σ, { ax, xa } ⊆ Σ*. So… f(x) = { y | y = ax or y = xa, a ∈ Σ }

S_{n} = the set obtained after n applications of the Inductive Rule
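A sketch (my own, with a hypothetical helper name) of this inductive definition in action: start from the basis S_0 = {λ} and apply R_1 on each pass to every current member. After n passes, S_n contains exactly the strings of length ≤ n.

```python
def sigma_star_upto(sigma, n):
    """S_n: apply the inductive rule n times starting from the basis {lambda}."""
    s = {""}  # basis rule: lambda is in Sigma*
    for _ in range(n):
        # R_1: for each x already in S and each a in Sigma, add ax and xa
        s |= {a + x for x in s for a in sigma} | {x + a for x in s for a in sigma}
    return s

print(sorted(sigma_star_upto({"a", "b"}, 2)))  # all strings of length <= 2
```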