{{title|Boolean Algebra}}
'''Boolean algebra''' (or less commonly '''symbolic logic''') is a [[instance of::branch of algebra]] that deals with only two logic values - [[0]] (corresponding to [[false]]) and [[1]] (corresponding to [[true]]).
 
Today, Boolean algebra is the primary mathematical tool used in designing modern digital systems. Switching functions are described using Boolean algebra since they deal with two discrete states - ON and OFF (or 1 and 0). Those functions are in turn implemented via [[transistors]] which act as switches, a natural implementation for representing Boolean algebra operations. Once primitive Boolean operation circuits such as [[NOT]], [[AND]], and [[OR]] [[logic gates|gates]] are implemented, any conceivable system of logic can be implemented using them like Lego pieces.
 
== Variables ==
{{main|/variables|boolean data type|l1=Boolean Variables}}
 
Boolean algebra uses variables just like normal algebra. Those variables can only have one of two values - either a 0 or a 1. Variables are commonly represented as a single letter. While there is no single accepted convention, it's not uncommon to see letters such as <math>A, B, \text{ and } C</math> used for inputs and <math>P, Q, R, \text{ and } Z</math> for outputs. That's also the convention used on WikiChip. Sometimes it's desired to represent the [[negated]] (opposite) value of a variable; that's often done with a bar above the letter or a tick (prime) next to it, for example <math>\bar A</math> or <math>A'</math>, although [[negation|other notations]] such as <math>\neg A</math> also exist. <math>\bar A</math> is read "not A", regardless of notation.
 
== 0 & 1 ==
 
Boolean algebra deals with only two unique states or values. We often represent those two values as <math>0</math> and <math>1</math>. However, it's important to understand that those are just two convenient representations; the set is often written as <math>\mathbb{B} = \{0,1\}</math>. You could just as well use <math>\blacksquare</math> and <math>\blacktriangle</math> instead and the math would work just fine.
 
  
 
== Operations & Truth tables ==
{{main|/operations|truth table|l1=Boolean Operations}}
Boolean algebra has a set of operations that can be performed on Boolean values; those operations are conveniently called '''[[binary operation]]s'''. The three common Boolean operators are '''[[conjunction|AND]]''', '''[[disjunction|OR]]''', and '''[[negation|NOT]]'''. Those operators are best understood by examining their behavior via a tool called a '''[[truth tables|truth table]]'''. A truth table is a table that lists all possible input values and their respective output values. Truth tables can be said to be the unique signature of a specific Boolean function. Truth tables are an excellent way of seeing the relationships between input values and given Boolean expressions. While there may be many ways to realize or construct a Boolean function to represent a specific relation, they all share the very same truth table. A [[truth-vector]] is a truth table in vector form.
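As an illustrative sketch only (Python, with 0 and 1 as the two logic values; the function names are arbitrary and not part of any standard library), the truth tables of the three basic operators can be enumerated programmatically:

<syntaxhighlight lang="python">
from itertools import product

# Truth tables of the three basic Boolean operators,
# using the integers 0 and 1 as the two logic values.
def NOT(a):    return 1 - a
def AND(a, b): return a & b
def OR(a, b):  return a | b

print("A B | AND OR")
for a, b in product((0, 1), repeat=2):
    print(f"{a} {b} |  {AND(a, b)}   {OR(a, b)}")

print("A | NOT")
for a in (0, 1):
    print(f"{a} |  {NOT(a)}")
</syntaxhighlight>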
  
 
=== AND operator ===
 
 
 
==Order of operations==
{{main|/order of operations|l1=Order of Operations}}
  
 
== Minimization ==
{{main|logic minimization}}
 
Generating Boolean functions for certain logic is often pretty straightforward. On many occasions, however, those functions might be more complicated than they have to be. '''Minimization''' is the process of transforming a Boolean function into its smallest [[logically equivalent|equivalent]] form. While the best circuit design depends on the technology involved, minimization usually translates directly into smaller circuits and lower implementation costs (area, time, etc.).
 
 
One way to minimize a Boolean expression is to simply massage the expression using the axioms and laws described earlier on. Such a process is heuristic in nature, and thus there is no single algorithm or set of rules that must be followed. For example, consider the following function <math>f(a,b,c,d) = ab+bc+\bar bc</math>. It can be hand-minimized as:
 
 
::<math>
\begin{align}
  f(a,b,c,d)  &= ab+bc+\bar bc \\
              &= ab+c(b+\bar b)  && \mbox{By Distributive Axiom} \\
              &= ab+c(1)         && \mbox{By Inverse Axiom}  \\
              &= ab+c            && \mbox{By Identity Axiom}
\end{align}
</math>
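Because a Boolean function has only finitely many input combinations, an equivalence like the one derived above can also be verified by brute force. A minimal sketch (the names <code>original</code> and <code>minimized</code> are purely illustrative; since the variable ''d'' does not appear in the expression, only ''a'', ''b'', and ''c'' are enumerated):

<syntaxhighlight lang="python">
from itertools import product

original  = lambda a, b, c: (a and b) or (b and c) or ((not b) and c)
minimized = lambda a, b, c: (a and b) or c

# The two expressions agree on all 2^3 combinations of a, b, c,
# so they describe the same Boolean function.
assert all(bool(original(a, b, c)) == bool(minimized(a, b, c))
           for a, b, c in product((0, 1), repeat=3))
print("ab + bc + b'c == ab + c for all inputs")
</syntaxhighlight>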
 
 
=== Karnaugh Map ===
 
{{main|Karnaugh Map}}
 
While minimizing functions through manual massaging of expressions works for basic functions, it becomes incredibly complex and time-consuming as the number of terms and variables involved increases. A '''[[Karnaugh Map]]''' (or '''K-Map''') is a technique that provides a systematic way to simplify as well as manipulate Boolean expressions.
 
 
{| class="wikitable" style="float: right; margin: 10px;"
 
! colspan="4" | [[Truth Table]]
 
|-
 
! colspan="4" | <math>f(a,b,c) = a \bar b + c(\bar a + b)</math>
 
|-
 
! colspan="3" | Inputs || Outputs
 
|-
 
! A !! B !! C !! Q
 
|-
 
| 0 || 0 || 0 || 0
 
|-
 
| 0 || 0 || 1 || 1
 
|-
 
| 0 || 1 || 0 || 0
 
|-
 
| 0 || 1 || 1 || 1
 
|-
 
| 1 || 0 || 0 || 1
 
|-
 
| 1 || 0 || 1 || 1
 
|-
 
| 1 || 1 || 0 || 0
 
|-
 
| 1 || 1 || 1 || 1
 
|}
 
A K-Map is actually a [[truth table]] that has been rearranged in a number of important ways to make it possible to visually find the expression we're looking for. An <math>n</math>-variable K-Map is a table consisting of <math>2^n</math> cells, each representing a single [[minterm]]. For each minterm where the function results in a <math>1</math>, we put a 1 in the corresponding cell. Conversely, for each minterm where the function results in a <math>0</math>, we put a 0 - or, more commonly, we leave the cell blank.
 
 
[[File:3-var blank k-map.svg|150px|left]]
 
Let's consider the following 3-variable Boolean function: <math>f(a,b,c) = a \bar b + c(\bar a + b)</math>. Because this function has 3 variables, we need a 3-variable K-Map, which means we're working with an 8-cell table. Note the special arrangement of the columns and rows: between two adjacent rows or columns, only a single variable is allowed to transition from 0 to 1 or from 1 to 0. As a result, after "01" we move to "11" instead of "10", which would have both variables transitioning between 0 and 1. For 3 variables, the table is arranged in 4 columns of 2 rows each. Two of the variables span the columns while the last variable spans the rows. Our choice of which variable goes where was arbitrary; any arrangement will work so long as the K-Map is constructed correctly from the truth table.
 
 
[[File:a and not b or c k-map (1).svg|150px|left]]
 
For every minterm in the truth table where the output is <math>1</math> we mark a 1 on the K-Map. The cells that result in <math>0</math> are left blank for convenience. Once the K-Map represents the entire truth table, visually inspect the table and locate the largest groups of adjacent cells containing 1. Those groups must contain a power-of-two number of cells; i.e. a group can only be of size 1, 2, 4, 8, etc. A group of 3 cells must therefore be broken down into 2 groups of 2 cells where one cell overlaps (see the [[k-map]] article for a more detailed explanation). In our K-Map, we have one group of 4 cells and another group of just 2 cells.
 
 
[[File:a and not b or c k-map (2).svg|150px|left]]
 
For each group marked down, we look at which variables are common to all the cells in the group. For the group consisting of four cells, neither ''a'' nor ''b'' is common since they both change, but ''c'' is always 1 for all four cells. Therefore the expression for that group is simply <math>c</math>. For the second group with just two cells, ''c'' is no longer common to the group. However, both ''a'' and ''b'' are now common since neither of them changes. Because ''a'' is always 1 and ''b'' is always 0 in that group, the expression for the second group is <math>a \bar b</math>. The final simplified expression is the sum of products (SoP) <math> a \bar b + c</math>. I.e.:
 
 
::<math>f(a,b,c) = a \bar b + c(\bar a + b) = a \bar b + c</math>
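As with the algebraic example earlier, the K-Map result can be double-checked exhaustively against the original expression; a small illustrative sketch:

<syntaxhighlight lang="python">
from itertools import product

f_original   = lambda a, b, c: (a and not b) or (c and ((not a) or b))
f_simplified = lambda a, b, c: (a and not b) or c

# Both forms of f produce the same output for all 2^3 inputs.
for a, b, c in product((0, 1), repeat=3):
    assert bool(f_original(a, b, c)) == bool(f_simplified(a, b, c))
print("a*b' + c*(a' + b) == a*b' + c")
</syntaxhighlight>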
 
 
=== Quine-McCluskey Method ===
 
{{main|quine-mccluskey method|l1=Quine-McCluskey Method}}
 
The '''Quine-McCluskey Method''' ('''QMM''') is an algorithm developed for minimizing Boolean expressions. The algorithm is functionally identical to how a [[K-Map]] works but operates in tabular form. Due to its algorithmic nature, it is much better suited to implementation as a program and can easily be applied to any number of variables and terms. QMM is used in various [[electronic design automation|EDA]] tools.
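At its core, the algorithm repeatedly combines pairs of implicants that differ in exactly one bit, replacing that bit with a don't-care. The fragment below sketches only that combining step on the minterms of the K-Map example above; it is not a complete QMM implementation, and the string encoding over '0', '1', and '-' is merely one common convention:

<syntaxhighlight lang="python">
def combine(t1, t2):
    """Combine two implicants (e.g. '101' and '100' -> '10-')
    if they differ in exactly one position; otherwise return None."""
    diff = [i for i, (x, y) in enumerate(zip(t1, t2)) if x != y]
    if len(diff) != 1:
        return None
    i = diff[0]
    return t1[:i] + '-' + t1[i+1:]

# Minterms of f(a,b,c) = a*b' + c*(a' + b): 1, 3, 4, 5, 7 (see truth table above)
minterms = ['001', '011', '100', '101', '111']
merged = {combine(x, y) for x in minterms for y in minterms if combine(x, y)}
print(merged)   # e.g. {'0-1', '-01', '10-', '1-1', '-11'}
</syntaxhighlight>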
 
 
==Boolean functions==
 
{{main|/function|l1=Boolean Function}}
 
A '''Boolean function''' is an algebraic function of the form <math>f:\mathbb{B}^n \to \mathbb{B}</math>, where <math>\mathbb{B} = \{0, 1\}</math> is the [[Boolean domain]] and <math>n \ge 0</math>. There are <math>2^{2^n}</math> possible Boolean functions of <math>n</math> Boolean variables.
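For example, for <math>n = 2</math> there are <math>2^{2^2} = 16</math> distinct functions, one per possible truth-vector. A short illustrative sketch:

<syntaxhighlight lang="python">
from itertools import product

n = 2
inputs = list(product((0, 1), repeat=n))           # the 2^n possible input vectors
# A Boolean function of n variables is fully determined by its truth-vector,
# i.e. by its output on each of those 2^n inputs.
functions = list(product((0, 1), repeat=len(inputs)))
print(len(functions))   # 16 == 2 ** (2 ** 2)
</syntaxhighlight>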
 
 
== Properties of functions ==
 
{{main|/function properties|l1=Boolean Function Properties}}
 
The properties of Boolean functions have been a subject of extensive research, especially in conjunction with [[switching theory]]. Understanding the properties of Boolean functions has proven helpful in various stages of logic design (e.g. [[logic synthesis]]). Below are some of the more important properties of Boolean functions.
 
 
=== Duality Principle ===
 
{{main|/duality principle|l1=Duality Principle}}
 
Earlier on it was pointed out that every axiom and every law has an OR form and an AND form. The '''Duality Principle''' simply states that when you take a valid Boolean statement and interchange all <math>\land</math> with <math>\lor</math> and all <math>1</math> with <math>0</math> and vice versa, you obtain its '''dual''', which is also a valid Boolean statement.
 
 
Consider the following statement <math>\overline{(a \land b)} = \bar a \lor \bar b</math> which happens to be [[DeMorgan's Law]]. Then by the duality principle we also know that <math>\overline{(a \lor b)} = \bar a \land \bar b</math> must also be true. Indeed that is the second form of the law.
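As an informal check (not part of the formal treatment above), the dual of a function can be computed by complementing its output and all of its inputs; applying this to NAND yields NOR, mirroring the AND/OR interchange:

<syntaxhighlight lang="python">
from itertools import product

def dual(f):
    """Dual of f: complement the output and every input."""
    return lambda *args: 1 - f(*(1 - a for a in args))

nand = lambda a, b: 1 - (a & b)
nor  = lambda a, b: 1 - (a | b)

# The dual of NAND is NOR on every input combination.
assert all(dual(nand)(a, b) == nor(a, b) for a, b in product((0, 1), repeat=2))
print("dual(NAND) == NOR")
</syntaxhighlight>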
 
==== Self-dual function ====
 
{{main|/self-dual function|l1=Self-dual Boolean function}}
 
A Boolean function is said to be a '''self-dual function''' if it is [[logically equivalent|equivalent]] to the same function with all inputs and outputs inverted. For example, consider the [[majority function]]. It is defined as:
 
::<math>\text{MAJ}(x,y,z) = (x \land y) \lor (x \land z) \lor (y \land z)</math>
 
 
We can derive its dual function by inverting the inputs and the output:
 
::<math>
\begin{align}
      \text{MAJ}^d(x,y,z) &= \overline{\text{MAJ}(\bar x,\bar y, \bar z)} & \mbox{By Duality Principle} \\
                          &= \overline{(\bar x \land \bar y) \lor (\bar x \land \bar z) \lor (\bar y \land \bar z)} \\
                          &= \overline{(\bar x \land \bar y)} \land \overline{(\bar x \land \bar z)} \land \overline{(\bar y \land \bar z)} & \mbox{By DeMorgan's Law} \\
                          &= (x \lor y) \land (x \lor z) \land (y \lor z) & \mbox{By DeMorgan's Law} \\
                          &= \text{MAJ}(x,y,z) \\
\end{align}
</math>
 
 
Since <math>\text{MAJ}(x,y,z) = \text{MAJ}^d(x,y,z)</math>, the [[majority function]] is said to be a self-dual function.
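The same self-duality can be confirmed exhaustively; a brief illustrative sketch:

<syntaxhighlight lang="python">
from itertools import product

maj = lambda x, y, z: (x & y) | (x & z) | (y & z)

# Complementing all inputs and the output gives back MAJ itself
# on every one of the 2^3 input combinations.
assert all(maj(x, y, z) == 1 - maj(1 - x, 1 - y, 1 - z)
           for x, y, z in product((0, 1), repeat=3))
print("MAJ is self-dual")
</syntaxhighlight>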
 
===Monotonic functions===
 
{{main|/monotonic function|l1=Monotonic Functions}}
 
The '''monotonicity property''' of Boolean functions says that for any two Boolean vectors <math>a, b \in \{0,1\}^n</math> such that <math>a_i \ge b_i</math> for all <math>i</math>, if <math>f(a) \ge f(b)</math> then <math>f</math> is a '''monotonically increasing''' function. Conversely, if <math>f(a) \le f(b)</math> then <math>f</math> is a '''monotonically decreasing''' function. In other words, a monotonically increasing (or decreasing) function never has its output change from a one to a zero (or from a zero to a one for decreasing) when an input changes from a zero to a one. Monotonically increasing functions can be constructed with just AND and OR operations, without negation. By extension, [[monotone circuit]]s are circuits built using only [[AND]] and [[OR]] [[logic gates|gates]].
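For small <math>n</math>, monotonicity can be checked by brute force over all pairs of comparable input vectors. A sketch with arbitrarily chosen helper names:

<syntaxhighlight lang="python">
from itertools import product

def is_monotone_increasing(f, n):
    """True if f never drops from 1 to 0 when an input goes from 0 to 1."""
    vectors = list(product((0, 1), repeat=n))
    return all(f(*a) >= f(*b)
               for a in vectors for b in vectors
               if all(x >= y for x, y in zip(a, b)))

maj = lambda x, y, z: (x & y) | (x & z) | (y & z)   # built from AND/OR only
xor = lambda x, y: x ^ y                            # requires negation

print(is_monotone_increasing(maj, 3))   # True
print(is_monotone_increasing(xor, 2))   # False
</syntaxhighlight>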
 
 
===Unate functions===
 
{{main|/unate function|l1=Unate Boolean Function}}
 
 
{{empty section}}
 
===Linear functions===
 
{{main|/linear function|l1=Linear Boolean Functions}}
 
A '''linear function''' is either the constant <math>0</math> function or the exclusive OR of <math>n</math> variables. That is, it is a function of the form
 
 
::<math>f(x_1, x_2, \ldots, x_n) = r_0 \oplus \bigoplus^n_{i=1}r_ix_i; \quad r_i \in \mathbb{B} = \{0, 1\}</math>
 
 
For example, consider the following function <math>f(a,b) = a \land b \lor \bar a \land \bar b</math>. Since we can write it in the polynomial form above (i.e. <math>f = 1 \oplus a \oplus b</math>), it's a linear function.
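That claim can be checked on all four input combinations; a tiny illustrative sketch:

<syntaxhighlight lang="python">
from itertools import product

f          = lambda a, b: (a & b) | ((1 - a) & (1 - b))   # a*b + a'*b' (XNOR)
polynomial = lambda a, b: 1 ^ a ^ b                       # r0 = r1 = r2 = 1

assert all(f(a, b) == polynomial(a, b) for a, b in product((0, 1), repeat=2))
print("a*b + a'*b' == 1 XOR a XOR b, so the function is linear")
</syntaxhighlight>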
 
 
===0/1 Preserving functions===
 
{{main|/0-preserving function|/1-preserving function|l1=0-Preserving Boolean Function|l2=1-Preserving Boolean Function}}
 
A Boolean function is said to be '''0-preserving''' if <math>f(0, 0, \ldots, 0) = 0</math>. Likewise a Boolean function is said to be '''1-preserving''' if <math>f(1, 1, \ldots, 1) = 1</math>.
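Both properties amount to evaluating the function at the all-zeros and all-ones vectors; for example, AND is both 0-preserving and 1-preserving, while XOR is 0-preserving but not 1-preserving. A short sketch:

<syntaxhighlight lang="python">
# Evaluate f at the all-zeros and all-ones input vectors.
zero_preserving = lambda f, n: f(*([0] * n)) == 0
one_preserving  = lambda f, n: f(*([1] * n)) == 1

AND = lambda a, b: a & b
XOR = lambda a, b: a ^ b

print(zero_preserving(AND, 2), one_preserving(AND, 2))   # True True
print(zero_preserving(XOR, 2), one_preserving(XOR, 2))   # True False
</syntaxhighlight>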
 
 
===Functional completeness===
 
{{main|/functional completeness|l1=Functional Completeness}}
 
A set of Boolean functions <math>\mathcal{F}</math> is said to be universal or complete iff for each of the following five properties it contains at least one function that lacks it: [[/0-preserving function|0-preserving]], [[/1-preserving function|1-preserving]], [[/self-dual function|self-dual]], [[/monotonically increasing function|monotonically increasing]], and [[/linear function|linear]]. A complete set can form any other Boolean expression by combining the functions in the set.
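For example, the set containing only [[NAND]] is complete: NAND is not 0-preserving (<math>\text{NAND}(0,0)=1</math>), not 1-preserving (<math>\text{NAND}(1,1)=0</math>), not self-dual, not monotonically increasing, and not linear. Its completeness can also be shown constructively by building NOT, AND, and OR out of NAND alone; an illustrative sketch:

<syntaxhighlight lang="python">
from itertools import product

NAND = lambda a, b: 1 - (a & b)

# Building the basic operators out of NAND alone:
NOT = lambda a:    NAND(a, a)
AND = lambda a, b: NOT(NAND(a, b))
OR  = lambda a, b: NAND(NOT(a), NOT(b))

for a, b in product((0, 1), repeat=2):
    assert AND(a, b) == (a & b)
    assert OR(a, b) == (a | b)
assert NOT(0) == 1 and NOT(1) == 0
print("{NAND} is functionally complete")
</syntaxhighlight>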
 
  
 
== Complementary Function ==
{{empty section}}
 
 
== Additional Logical connectives==
 
{{main|Logical connective}}
 
In addition to the basic AND, OR, and NOT, there are a number of other logical connectives.
 
=== XOR ===
 
{{main|xor}}
 
{{empty section}}
 
 
=== conditional===
 
{{main|conditional}}
 
<div style="float: right;">{{truth table/if}}</div>
 
It's often necessary to express things in the form <math>\text{If } A\text{ then }B</math>. In that form, <math>A \rightarrow B</math> is called a '''conditional''', <math>A</math> is the '''antecedent''', and <math>B</math> is the '''consequent'''. In order for the expression to hold, when <math>A</math> is true, <math>B</math> must also be true. Conversely, when <math>A</math> is false, the statement is trivially true regardless of <math>B</math>.
 
{{clear}}
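As an informal check, the conditional has the same truth table as <math>\bar A \lor B</math>; a brief sketch:

<syntaxhighlight lang="python">
from itertools import product

implies    = lambda a, b: 1 if (a == 0 or b == 1) else 0   # "if A then B"
equivalent = lambda a, b: (1 - a) | b                      # NOT A  OR  B

assert all(implies(a, b) == equivalent(a, b) for a, b in product((0, 1), repeat=2))
print("A -> B has the same truth table as A' + B")
</syntaxhighlight>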
 
 
=== biconditional===
 
{{main|biconditional}}
 
<div style="float: right;">{{truth table/iff}}</div>
 
A proposition in the form <math>A  \leftrightarrow B</math> is called a '''{{ba|biconditional}}''' and represents the truth function: ''A'' if and only if ''B''. The truth table is thus identical to that of <math>(A \rightarrow B)\land (B \rightarrow A)</math>. In fact, the standard method of proving a biconditional is to prove <math>A \rightarrow B</math> and then to prove <math>B \rightarrow A</math>.
 
{{clear}}
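That equivalence can likewise be verified mechanically; a brief sketch:

<syntaxhighlight lang="python">
from itertools import product

iff       = lambda a, b: 1 if a == b else 0                 # A if and only if B
both_ways = lambda a, b: ((1 - a) | b) & ((1 - b) | a)      # (A -> B) AND (B -> A)

assert all(iff(a, b) == both_ways(a, b) for a, b in product((0, 1), repeat=2))
print("A <-> B has the same truth table as (A -> B)*(B -> A)")
</syntaxhighlight>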
 
== Propositional calculus==
 
{{main|propositional calculus}}
 
{{empty section}}
 
 
== Posets and Lattices==
 
{{empty section}}
 
 
== Diagrammatic representations ==
 
{{empty section}}
 
===Venn Diagram===
 
{{empty section}}
 
===Logic Diagram===
 
{{empty section}}
 
===Hasse Diagram===
 
{{empty section}}
 
 
== Additional topics ==
 
 
{{empty section}}
  
 
== See also ==
* [[Karnaugh Map]]
 
[[wikidata id::Q173183| ]]
 
