Wednesday, July 4, 2007

An example of why the OSI model is not a creed

In my first post about the OSI model I explained that, historically, it became more of a "guideline" for how information gets from one computer to another, and that not everything fits neatly into it. Robert Graham at Errata Security posted an interesting blog entry that makes my point.

Thursday, June 21, 2007

How to get rid of the crap that sticks to your computer

Techrepublic's George Ou wrote a great article on how to de-gunkify your PC of most of the junk it accumulates: the trial software on a brand-new PC, the pile of programs that run on startup and slow the boot, the leftovers from programs you've since uninstalled, and so on. Basically, whenever your computer is running slow and/or you need more hard drive space, this is one of the first things you should do.

He suggests two great little programs, Autoruns and CCleaner, and does a pretty good job of explaining how to use them and why you would run them.

Thursday, June 14, 2007

The Sufficiency of Scripture - 3

The Form of Biblical Teaching

How could the Bible be a code for human conduct without being exhaustive about how we are to behave? And should we expect an authoritative source to be exhaustive on a subject?

Dr. Weeks concludes that the Bible commits itself neither to exhaustive detail nor to general principles alone. He gives a few different examples from both the laws given on Sinai and the laws recorded in Deuteronomy. Anyone who has read the Biblical books of Exodus and Deuteronomy can attest to the massive detail. However, we also have general principles, like the Ten Commandments, and there is some interaction between the general and the detailed laws in the commandments.

An example of this interaction is found between Exodus 20:4-6 and verses 22-26. The general principle is given in what most Protestants would call the second commandment, which forbids the worship of idols. However, Israel would have had no guidance in how to make sacrifices to God if that were the end of the matter; therefore, the details are given in verses 22-26. The point is that the Bible often does not restate the obvious, but it does address situations where it might be uncertain whether the law applies.

In addition, there are also cases where commandments embody a more general concern than just what is specifically stated. For example, in Deuteronomy 25:4 the command is that one should not prevent an ox threshing grain from helping itself to the fruit of its labors. Leviticus 19:13 also prohibits withholding a hired man’s wages. The concern is to allow a worker to enjoy the fruit of their labor. Paul then, in 1 Corinthians 9:6-11, uses the law on the oxen as support for a preacher of the gospel receiving compensation for his work.

It can therefore be seen that the Bible mixes general and specific laws together when needed for clarity, and this can also be extended to doctrinal teaching. For example, the issue of salvation is treated at length in Romans, Galatians, and Ephesians, but Christ's incarnation receives a briefer treatment in Philippians 2. The reason is quite simple: disputed doctrines receive more treatment than undisputed doctrines.

This premise can be extended to non-theological subjects, where, because the Bible says relatively little about a subject, most people assume it says nothing about the subject. The question is not: does the Bible give every last detail in science, history, ethics, etc.? The real question is: does the Bible say anything, whether in general or in detail, relevant to science, history, ethics, etc.?

To ask this question is to answer it, because the textbook argument would not need to be made if there were no Biblical passages pertaining to these disciplines. However, the ability to point them out does not by itself determine their meaning or importance. All it means is that we cannot refuse to consider them.

Thursday, June 7, 2007

The Sufficiency of Scripture - 2

Authority and Exhaustiveness

How would anybody go about deciding whether the Bible is authoritative in any area of life? It does not matter what the area is - religion, ethics, science, history, etc. - how would you go about determining whether you should let the Bible inform your view on anything? Dr. Weeks lists two different ways of going about what for some people is a very easy question and for others is quite a daunting task. Here they are:

  1. Does the Bible claim authority in that area?
  2. Does the Bible have the character of an authoritative source in that area?

We have another problem: the Bible itself may not even employ the same modern distinctions which our minds have grown up hearing for so long that we do not know how to even question them. The Bible's authority claims tend to be general. For example, Paul writes to Timothy that it is given to furnish a man for "every good work" (2 Tim. 3:17). We look at that and say, "Well, Paul meant every good religious work."

What if, just for argument's sake, we suppose that Paul's "good works" are broader than the narrower sphere of modern "religion"? In considering this line of thought we come to the second question mentioned earlier.

Is the Bible an authoritative source in science? Before discounting it as one, we need to ask what counts as an authoritative source in science in our modern times. The answer would have to be a science textbook. Does the Bible contain all the laws and facts of physics? No, but what is our standard for a good textbook on physics? We would expect absolute accuracy and exhaustive detail; any book that lacks some fact or detail could not be the final authority in physics. In other words, we are making exhaustiveness a standard of authority.

Many who are concerned with the Bible overlapping into areas where it does not quite measure up are, however, eager to maintain that the Bible is their religious authority. The problem is the Bible is not exhaustive in matters of religion either. There are many religious questions which have no precise and detailed answer. Just think of all the controversies which have divided Christians over the centuries. So, on religious and ethical questions the Bible cannot be an authority either if we are looking for exhaustive detail.

If the Bible is not exhaustive even in religion and ethics, then how could it operate as a code for human conduct? This also raises the question of whether we are right to expect an authoritative source to say the last word on every subject.

Wednesday, June 6, 2007

The Sufficiency of Scripture - 1

Introduction

Dr. Weeks says that this book is written as a response to new elements in the ongoing discussions about the Bible. In the past, he says, lines were drawn between those who affirmed that the Bible was infallible and those who denied infallibility. The point at which infallibility was questioned didn't matter; historical and doctrinal questions were all treated the same.

Now the two sides have become less easy to define. Dr. Weeks believes the shift is still in progress and so it is unwise to attempt to categorize the parties. However, he does note two tendencies:
  1. Restrict the area of Biblical infallibility - this tendency manifests itself as attempts to limit the Bible to religious questions or, as proponents of this view would put it, to the heart of the gospel. It is expressed by saying the Bible is not a science textbook because its focus is on our salvation through Christ, or that the historical events described in the Bible did not necessarily happen as recorded, but that this is irrelevant to the gospel.
  2. As a result of responses like these, the old evangelical/liberal distinction is out of date, because a person who feels free to question the infallibility of the Bible in science or history might still claim a concern to bring men to see their need of Christ as Saviour and Lord.
Therefore, it is Dr. Weeks' aim to take up and examine the issues and arguments in this debate. The book has two parts: part one covers the theological convictions, and part two the practical outworking on issues which have become interwoven with questions of the authority and accuracy of Scripture.

Monday, June 4, 2007

The Sufficiency of Scripture by Noel Weeks


I found this book to be a great resource on issues dealing with how to understand the Bible in today's world. Dr. Noel Weeks knows what it really means for the Scriptures to be, in the words of Paul to Timothy (2 Tim. 3:16-17), ". . . inspired by God and profitable for teaching, for reproof, for correction, for training in righteousness; so that the man of God may be adequate, equipped for every good work."

Dr. Weeks not only concisely shows the reader what this means today, but he applies it to issues which are very debatable in the church and outside the church.

What I'm going to do is summarize Dr. Weeks' main points chapter by chapter. I will start by typing out his table of contents. If this book looks interesting you can purchase it through Amazon.com.

Acknowledgments
Introduction

PART ONE: BASIC ISSUES
  1. Authority and Exhaustiveness
  2. The Form of Biblical Teaching
  3. General and Special Revelation
  4. Providence and Scripture
  5. The Bible and Technical Precision
  6. Imprecision and Error
  7. The History of Revelation
  8. The Perfect Translation
  9. Words and Meanings
  10. The Bible and the Historian
  11. Words and Meanings Again
  12. The Human Element in Scripture
  13. Contextualization
  14. The Hermeneutical Circle
  15. The Redemptive Focus of Scripture
PART TWO: POINTS OF CONTENTION
Introduction

  1. The Creation Account
  2. The Interpretation of Prophecy
  3. Women in Teaching/Ruling Offices in the Church
  4. Slavery
  5. The Worship and Government of the Church
  6. The Scripture and 'Advances' in Psychology
  7. 'Rabbinic' Exegesis in the New Testament
  8. Pseudepigraphy
  9. Proving the Bible
  10. Freedom and Honesty
  11. The Political and Social Task of the Church
  12. Bible Translation

Conclusion
Index

I will start with the first Introduction at another time.

A movie about the growth of the Internet

I was reading the latest Internet Protocol Journal (Vol. 10 No.1) and found the following:

BGP: The Movie

Statistics on Internet resources have been animated to provide a high-level overview of the consumption and use of IPv4 addresses and AS numbers since 1983. The animated video also clearly shows the effect of Classless Interdomain Routing (CIDR) and Regional Internet Registries (RIR) allocation policies on consumption rates and routing. This animation was developed by Asia Pacific Network Information Centre (APNIC) staff members, Geoff Huston and George Michaelson. You can download the 58MB movie from:
http://www.apnic.net/news/hot-topics/docs/bgp-movie.mpg



I had to right-click the link and do a "Save Link As . . ." to get it to play correctly. It is an interesting way to present the Internet-number shortage problem, and the narrator does a pretty good job of explaining what is going on without making it overly technical; a few of the words might be new :)

Thursday, May 31, 2007

Day 30: The TCP/IP Model

The Internet and most modern networks use the TCP/IP protocol suite to communicate. Whereas the OSI protocol suite and its corresponding model came about in committee rooms, TCP/IP grew up alongside the ARPANET, which then became the Internet. TCP/IP development was the result of watching how the ARPANET was growing and coming up with services to provide more functionality to the network.

Therefore, unlike my last post, which focused primarily on how information theoretically gets from point A to point B, this time I will be talking about actual services.

What does TCP/IP mean?
TCP/IP actually is an abbreviation of two different protocols as indicated by the slash. TCP stands for Transmission Control Protocol and IP is the abbreviation for Internet Protocol.

What does the TCP/IP model look like?
TCP/IP's model has only four layers and is typically shown in relation to the OSI model.

Its layers have nearly the same functions and responsibilities, and the model operates the same way. However, TCP/IP looks at the network a bit differently.

For example, in the TCP/IP model there is no Physical layer, and as you can see, the model's Application layer blurs together the top three OSI layers.

Instead of describing each layer in detail, I will only note differences from the OSI model, along with the protocols that work at each layer and how they are used.

Network Access - Even though the picture shows that this TCP/IP layer includes the OSI model's Physical layer, that's a lie meant to confuse you, because it is pure data link layer all the way. So, what runs here? Not all networks use Ethernet or Token Ring, and for those scenarios when TCP/IP must operate over some other network - a phone line, for example - TCP/IP defines protocols to allow communication. Only two protocols fit this bill: SLIP and PPP. If I discuss those, it will be at another time.

Internet - Two routed protocols at this layer are IPv4 and IPv6. Internet Control Message Protocol (ICMP) handles error reporting. IP Security (IPsec) operates at this layer. In IPv6 a new protocol called Neighbor Discovery Protocol (NDP) operates at this layer as well; NDP replaces two standards that exist in the current version of IP: Address Resolution Protocol (ARP) and Reverse Address Resolution Protocol (RARP). Finally, this layer has at least nine different routing protocols:

  1. Gateway-to-Gateway Protocol (GGP)
  2. HELLO Protocol (HELLO)
  3. Exterior Gateway Protocol (EGP)
  4. Routing Information Protocol (RIP)
  5. Interior Gateway Routing Protocol (IGRP)
  6. Enhanced Interior Gateway Routing Protocol (EIGRP)
  7. Open Shortest Path First (OSPF)
  8. Border Gateway Protocol (BGP)
  9. Intermediate system to Intermediate system (IS-IS)

Some of the protocols on this list are long dead and gone (mostly the top three, and number 5 is in the process of being phased out). The rest either have or will have an IPv6 equivalent. Why so many? Part of the answer is that networks come in many different sizes and different protocols work (or not) depending on the size. The other part is that some of these protocols have been around since the Internet was four networks on the west coast; as the network grew, new issues arose, and some of these protocols were the result of those growing pains.

Transport - Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) are the two main protocols that operate at this layer. UDP is basically the small and efficient protocol, but it does not provide any reliability or flow-control features for data. It is used when that overhead is not desired, as with streaming video. TCP provides all the features of UDP plus reliability and flow control.
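
The difference is easy to see in code. Here is a minimal sketch using Python's standard socket module (the address and port are placeholders I made up, not anything from the exam material):

    import socket

    # UDP: no connection and no delivery guarantee - just send the
    # datagram and hope. Cheap, which is why streaming likes it.
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.sendto(b"fire and forget", ("192.0.2.1", 5000))  # placeholder host
    udp.close()

    # TCP: connect() performs the three-way handshake first, and after
    # that the stack handles acknowledgments and retransmission for us.
    tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    tcp.connect(("192.0.2.1", 5000))  # same placeholder host
    tcp.sendall(b"reliable, ordered bytes")
    tcp.close()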

Application -

  1. One network application you rely on every time you get on the Internet and type in www.google.com is the Domain Name System (DNS); if it did not work you would need to know IP addresses like 64.233.167.99 (see the sketch after this list)
  2. The Network File System (NFS) protocol is used more often with the UNIX operating system, but Microsoft does support its use
  3. Bootstrap Protocol (BOOTP) and Dynamic Host Configuration Protocol (DHCP) are similar in function - they are both ways to get an IP address without needing to type it in every single time the computer starts up - but BOOTP is older, requires more administrative overhead, and is not as configurable as DHCP
  4. Simple Network Management Protocol (SNMP) allows for remote management of network devices
  5. File Transfer Protocol (FTP) and Trivial File Transfer Protocol (TFTP) do exactly what their names say, just that one uses TCP and the other UDP
  6. Multipurpose Internet Mail Extensions (MIME), Simple Mail Transfer Protocol (SMTP), Post Office Protocol (POP), and Internet Message Access Protocol (IMAP) all define different parts that make e-mail work
  7. Network News Transfer Protocol (NNTP) transfers Usenet news messages between hosts
  8. Hypertext Transfer Protocol (HTTP) is probably the most recognizable of the bunch because web browsers do not hide it (look at the http:// in your address bar)
  9. The Gopher Protocol preceded the World Wide Web in providing document retrieval - it is nearly dead, if not completely dead
  10. The Telnet Protocol allows one machine to establish a remote terminal session on another machine
  11. Internet Relay Chat (IRC) allows real-time chatting between TCP/IP users
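
Item 1 is trivial to demonstrate. Here is a minimal Python sketch of a DNS lookup (the host name is just an example):

    import socket

    # Ask the resolver - ultimately DNS - to turn a name into an address.
    # Without DNS you would have to memorize the dotted-quad yourself.
    address = socket.gethostbyname("www.google.com")
    print(address)  # e.g. 64.233.167.99, whatever the resolver returns today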

There are more protocols, but as you can see most of them are transparent to you.

I hope some connections are being made about how communication happens on networks. A lot of material has been covered between this post and the last about how networks conceptually work; next time I will apply all of it to how networks actually work.

Tuesday, May 29, 2007

A really great list of God's Work

I was looking through the Reformation Theology blog and found a really good list by Nathan Pitchford titled, "Doctrines of Grace - Categorized Scripture List." Check it out with a Bible handy and prepare to be amazed at what God has done through the work of Jesus Christ!

I also want to note that this is a systematic theology, and all that means is that, as God has progressively revealed more about Himself throughout history, we (Christians) can look at the whole of Scripture and piece together what's already in the text. It does not mean we come up with an idea of what God should be like and then find sources that support that belief.

Happy reading!

Sunday, May 27, 2007

How Microsoft made my sister's laptop freeze then fixed it

For just under two years Microsoft has had a free "service" called Patch Tuesday: every second Tuesday of the month they make updates available via Microsoft/Windows Update and automatically through a program called, you'll never guess, Automatic Updates! Prior to Patch Tuesday, Microsoft would release patches and updates whenever they saw fit - which was not so considerate of their customers, but when you're rolling in dough and your customers aren't complaining too much, you're fine. Back on track: depending on its configuration, Automatic Updates can either download all the "critical updates" and install them automatically; or download them and have an annoying but loving prompt guide the user through installing them (and let's not forget the reboot); or be disabled completely so that you download and install updates manually. All these options, except the last, make the astronomical assumption that once you install the updates your computer will run at least about the same as it did before. Microsoft is famous for being quite the contrarian on this point. Mind you, other companies have this problem also.

Now understand that I am not saying Automatic Updates is a bad program; I am just saying that you need to be aware of what you're installing, and that Microsoft is not immune from this good advice. The case in point involves update KB916089 (the "KB" stands for Knowledge Base), which was released on Nov. 14, 2006. I don't know if my sister's laptop was experiencing symptoms before about two weeks ago, but she told my Dad that her computer was taking a long time to load anything. We feared that this was the same overheating problem she had experienced in late January/early February, which resulted in her laptop getting a new motherboard.

However, as I looked at the programs running in the background (called processes) I noticed that one instance of a process called svchost.exe was going crazy, using between 98 and 100% of the CPU time. By the way, you can see this information one of two ways: either press Ctrl + Alt + Delete to bring up the Windows Task Manager and click on the "Processes" tab, or download and run an even more useful program by Mark Russinovich called Process Explorer. I knew this process was indeed legitimate, but I also knew it wasn't supposed to take up so much attention from the CPU. Therefore, I hit the "End Process" button and proceeded to surf the web.
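
If you prefer a script to Task Manager, here is a minimal sketch using the third-party Python library psutil (my own illustration, not something Dell or Microsoft suggested) that flags any process hogging the CPU:

    import time
    import psutil

    # cpu_percent() measures usage since the previous call, so prime the
    # counters first, wait a moment, then read the real numbers.
    procs = list(psutil.process_iter(["pid", "name"]))
    for p in procs:
        try:
            p.cpu_percent(None)
        except psutil.Error:
            pass

    time.sleep(1.0)

    for p in procs:
        try:
            usage = p.cpu_percent(None)
        except psutil.Error:
            continue  # the process exited while we slept
        if usage > 90:  # anything pegged like that svchost.exe was
            print(p.info["pid"], p.info["name"], usage)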

Within two minutes svchost.exe would start back up and I would end the process again. I knew something was wrong, so I looked on the net. There were pages basically suggesting that my sister had a virus and here's how you fix it, but that did not solve my particular problem, so I called Dell. The technician explained what was going on - the Microsoft update gone awry - and had me boot into Safe Mode, then download and apply KB927891. The problem was both caused and fixed by Microsoft.

If you read the Security Advisory for KB927891 and the "More Information" section for KB916089, what you find out is that a scan performed by Automatic Updates was going bonkers. I hope this will save someone a lot of trouble if they are experiencing the conditions I described.

Tuesday, May 22, 2007

Day 31 - The OSI Model

I had a difficult time deciding whether I was going to talk about Microsoft, Cisco, or the Bible, so I just decided "why not networking?" Therefore, the most logical place to start in networking is the OSI reference model. By the way, I am going to be using this blog to help me recertify my CCNA and will be following the organization of Scott Bennett's 31 Days Before Your CCNA Exam, but I will be referring primarily to Internet articles for more information.

What does OSI mean?
OSI stands for Open Systems Interconnection.

A brief history
Before the 70s, when computers were really becoming desirable to researchers, everybody had to write their own protocols and software for the specific mainframe that was on their campus. There was no way to collaborate with other students and professors, and in most cases they didn't want to, because they feared it would mean reinvesting in new hardware and reprogramming code.

Therefore, the International Organization for Standardization (ISO) took up the charge to create a universal set of protocols called the OSI protocol suite. Alongside the protocol suite, ISO created a model which showed how everything should fit into place. It was nice and neat and had clear boundaries. In the early 80s it competed with the TCP/IP protocol suite for adoption by vendors. TCP/IP won, by the way. However, the model has stuck around, relegated to a tool for education and a "guideline" for developers, forever separated from being the law of the land. As a result of losing that battle, most technologies do not fit perfectly within its seven layers.

Why is it something I should know?
The OSI reference model is important to understand because it explains and "simplifies" how devices on a network communicate. The model splits the great big black box known as a "computer network" into seven pieces, henceforth called layers, each of which plays a particular role in getting information from point A to point B. One example of how the OSI model is practical is that it organizes my brain whenever I need to solve a connectivity problem between machines. Both article number one and number two by Techrepublic's David Davis explain how this model can be used in a few different ways to solve network problems. Another application is that it can help developers clarify the roles performed by each component in a networking system.
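
As a toy illustration of that layered troubleshooting habit (my own sketch, not from Davis's articles; the host and port are placeholders), a few lines of Python can test the stack in two stages: can we resolve a name at all, and can we open a TCP connection underneath it?

    import socket

    host, port = "www.example.com", 80  # placeholders

    # Step 1: name resolution. A failure here points at DNS or host
    # configuration, not at cabling.
    try:
        addr = socket.gethostbyname(host)
    except socket.gaierror as e:
        raise SystemExit("DNS lookup failed: %s" % e)

    # Step 2: a TCP connect exercises everything from the physical layer
    # up through transport. A failure here means lower-layer or firewall
    # trouble even though name resolution was fine.
    try:
        conn = socket.create_connection((addr, port), timeout=5)
        conn.close()
        print("reached", host, "at", addr)
    except OSError as e:
        print("TCP connect failed:", e)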

So what are these layers?
Here's a picture of the seven layers and the names of the PDUs (Protocol Data Units) at each point during transmission (the boxes in the middle). It's important to note that each PDU has a specific format.

For now let's continue with a high-level overview. Each layer is responsible for performing its specific task and dealing with the layers above and below it. The lower layers (1-4) deal more with moving the data around, and you transition from hardware to software as you proceed up the model. The upper layers (5-7) are concerned more with user interaction and implementing applications over the network than with how the data is delivered. As a result, the lower layers are handled by every device between source and destination, while the upper layers are handled by software on the endpoints. Each layer interacts only with its adjoining layers, via service data units (SDUs), and logically with its peer layer on the destination computer.
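
To make the PDU idea concrete, here is a toy Python sketch of encapsulation. The header layouts are invented purely for illustration - real TCP/IP and Ethernet headers look nothing like this - but the wrapping at each layer is the point:

    import struct

    data = b"hello"                            # application data

    # Transport: prepend a toy segment header (source port, dest port).
    segment = struct.pack("!HH", 49152, 80) + data

    # Network: prepend a toy packet header (source and dest addresses).
    packet = struct.pack("!4s4s", bytes([10, 0, 0, 1]),
                         bytes([10, 0, 0, 2])) + segment

    # Data link: wrap the packet in a toy frame with a length field.
    frame = struct.pack("!H", len(packet)) + packet

    # Physical: the frame finally leaves as bits on the wire.
    print(frame.hex())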

What's done at each layer?

Physical - Bits physically pulse or wave their way over the network media representing 1s or 0s. This layer is responsible for:

  • operation of hardware devices
  • various encoding and signaling functions that transform data to signals
  • transmission and reception of those signals
  • voltage of electrical current to transport signal
  • media type and impedance characteristics
  • physical shape of connector to terminate media
  • physical topology

Something the physical layer does not describe is the transmission media itself, so feel free to shoot data across whatever media you like as long as everything leading up to that point is within specification.

Wireless technology has also added to the list of what operates at this layer: frequency, amplifiers, antenna mode, and radio role.

As with layers 2 and 3, networking devices operate at different layers of the model. When people talk about a device being at a certain layer, they typically mean the highest layer at which that device operates, and imply that said device also operates at the lower layers. Therefore both a hub and a repeater operate no higher than the physical layer. I'll talk more about what that means in terms of network traffic in a later post, but if you take what I have already described about the physical layer you'll already have a good picture.

Data Link - Many wired and wireless LAN (Local Area Network) technologies function at this layer - for example, switches, bridges, and even the network interface card (NIC) that's probably connecting you to the Internet right now. At this layer information starts getting less physical and more logical. Another standards body, the Institute of Electrical and Electronics Engineers (IEEE), in its 802 Project divides the data link layer into two sublayers known as media access control (MAC) and logical link control (LLC).

The MAC sublayer is the physical half of the data link layer and is used by devices to control access to the network medium, which includes physically transmitting and receiving data as well as having a physical address assigned jointly by the IEEE and the card's manufacturer. Rules are made and implemented at this sublayer for managing the shared medium to avoid conflicts. I will return to the MAC sublayer another time.
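
That joint assignment is visible in the address itself: the first three bytes are the IEEE-assigned vendor prefix (the OUI) and the last three are chosen by the manufacturer. A quick Python sketch with a made-up address:

    mac = "00:11:22:AA:BB:CC"  # a made-up example address
    octets = mac.split(":")

    oui = ":".join(octets[:3])     # assigned to the vendor by the IEEE
    serial = ":".join(octets[3:])  # assigned by the card's manufacturer

    print("vendor prefix (OUI):", oui)
    print("manufacturer-assigned part:", serial)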

The LLC sublayer is the logical half of the data link layer; it provides services to the network layer (above it) and hides the rest of the data link layer's details from it. It also identifies the upper-layer protocol and provides control functions and connection services. This sublayer can also provide reliable delivery of data frames.

Network - Provides (logical) connectivity and path selection between two host systems that might be located on different networks. Additionally, this layer is where logical addressing happens. Routers embody this layer as they are the devices that manage almost everything at this layer and divide networks. The functions at this layer are:

  • Logical Addressing - An address that devices communicate with that is independent of hardware and unique across the entire network; the implementation is called a routed protocol
  • Routing - Moving data across a series of networks. Routing protocols do this by communicating with other routers to update and maintain tables thereby supporting routed protocols
  • Packet Encapsulation - Takes data from higher layers and places them into packets
  • Error Handling and Diagnostics - Special protocols allow logically connected devices to exchange information about the status of hosts on the network or themselves

The use of the network layer is optional in data communications. It is only required if the corresponding host resides on another network or if an application requires its services.
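
Logical addressing is easy to play with from code. Here is a small sketch using Python's ipaddress module (the addresses are picked arbitrarily) showing the test that decides whether routing is needed at all:

    import ipaddress

    # An arbitrary example network and two example hosts.
    net = ipaddress.ip_network("192.168.1.0/24")
    local = ipaddress.ip_address("192.168.1.42")
    remote = ipaddress.ip_address("10.0.0.7")

    # Same network: the frame can be delivered directly on the local
    # segment. Different network: a router has to carry the packet.
    print(local in net)   # True  - no routing required
    print(remote in net)  # False - send it to the router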

Transport - This is the transition point between the lower layers, which deal with data delivery issues, and the higher three layers, which work with application software. Unlike the data link layer, the transport layer can provide this function beyond the local LAN segment because it sits on top of the network layer. Or to put it another way, the transport layer performs its operations only at the endpoints, while the data link layer performs its operations at every stop along the path. The transport layer does the following:

  • Segments (or divides into smaller pieces) upper-layer application data
  • Sends those segments from one end device to another end device
  • Process-Level Addressing - This allows the computer to differentiate between software programs and different instances of the same program
  • Multi-/Demultiplexing - Using the above addressing, transport layer protocols combine (multiplex) the data from many processes into a single stream of data to be sent; demultiplexing is the opposite operation when the data arrives at its destination (see the sketch at the end of this section)

The layer may also:

  • Establish end-to-end operations
  • Flow control provided by sliding windows
  • Reliability provided by sequence numbers, acknowledgments and retransmissions
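
Process-level addressing and multiplexing boil down to port numbers. Here is a minimal Python sketch (ports chosen arbitrarily) of two programs on one machine being told apart by port alone:

    import socket

    # Two sockets on the same host, distinguished only by port number.
    # The transport layer uses the destination port to demultiplex
    # incoming data to the right process.
    a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    a.bind(("127.0.0.1", 50001))  # pretend this is program A

    b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    b.bind(("127.0.0.1", 50002))  # pretend this is program B

    # A datagram addressed to port 50002 lands at b, never at a.
    a.sendto(b"hello B", ("127.0.0.1", 50002))
    data, src = b.recvfrom(1024)
    print(data, "from", src)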


Session - Set up, manage, and tear down sessions between programs exchanging data. Instead of protocols per se, software at this level resembles tools called application program interfaces (APIs) which allow programmers to develop networking applications without needing to know lower-level details.

Presentation - Responsible for managing the way data is encoded. It differs from the other layers in two key respects. First, it's more limited and specific in function than the others. Second, it is not used as often and is not required by many protocols for communication. Here are some of the specific types of data-handling issues the presentation layer handles:

  • Translation - On any given network different types of computers can exist such as PCs, Macs, Linux systems. Each has its own characteristics and represents data in different ways. The presentation layer hides these differences between machines.
  • Compression - Compression and decompression may be done here to improve throughput of data
  • Encryption - Some types of en/de-cryption are performed at this layer to ensure the security of the data as it travels down the protocol stack. For example, the SSL protocol.

Graphics standards such as PICT, TIFF, and JPEG operate here, as do sound and movie formats like MIDI and MPEG, just to name a few.
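
Compression is the easiest of those jobs to demonstrate. A minimal sketch using Python's standard zlib module:

    import zlib

    # Repetitive data, like much of what crosses a network, squeezes well.
    payload = b"the quick brown fox " * 100

    compressed = zlib.compress(payload)     # done before transmission
    restored = zlib.decompress(compressed)  # undone at the receiver

    print(len(payload), "->", len(compressed))  # far fewer bytes to send
    assert restored == payload                  # lossless round trip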

Application - When I was first learning about the Application layer I made the mistake of thinking this layer was talking about Internet Explorer and Netscape Navigator, but those are user applications. What the model's creators had in mind is network applications, which serve user applications outside of the OSI model.

For example, the two browsers (think Firefox instead of Navigator) both use the same network protocol that operates at the application layer, called Hypertext Transfer Protocol (HTTP). With a few exceptions, all of your different web applications, e-mail clients, or whatever else you use the Internet for rely on some network application at the application layer.
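
Since the browser does the HTTP for you, it is easy to forget how simple the protocol underneath is. Here is a minimal sketch of a browser's first move, in Python, speaking old HTTP/1.0 to keep it short (the host is just an example):

    import socket

    # Open a TCP connection to the web server's well-known port.
    conn = socket.create_connection(("www.google.com", 80))

    # The application-layer protocol is just lines of text.
    conn.sendall(b"GET / HTTP/1.0\r\nHost: www.google.com\r\n\r\n")

    # Read the reply: a status line, headers, then the page itself.
    response = conn.recv(4096)
    print(response.decode("latin-1")[:200])
    conn.close()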

I'm finally done writing this :) Trust me, if it's any good it won't take you as long to read it as it took me to write it. Please tell me if it lagged anywhere or if I was unclear anyplace. If it seemed too academic then I achieved my goal, because next I'll write about how the Internet really works: the TCP/IP protocol suite. Hopefully it will be a bit shorter, too. I didn't write all this out of my head, so here are all my other references:

  1. Data Communication & Computer Networks: A Business User's Approach, 2nd Ed. Curt M. White. Thomson Course Technology, 2002.
  2. Cisco Networking Academy Program CCNA 1 and 2 Companion Guide, Revised 3rd Ed. Cisco Press, 2004.
  3. Cisco Networking Academy Program Fundamentals of Wireless LANs Companion Guide. Cisco Press, 2004.
  4. IP Routing Fundamentals. Mark A. Sportack. Cisco Press 1999.
  5. The TCP/IP Guide: A Comprehensive, Illustrated Internet Protocols Reference. Charles M. Kozierok. No Starch Press, 2005. The online version is at: http://www.tcpipguide.com

Wednesday, May 16, 2007

Reason for being

The first order of business for this blog is to state its purpose. My chief reason is to create a catalog of my different thoughts and ideas, so this blog is primarily created for selfish reasons. I am becoming more and more aware over time that I'm forgetting things I learn or hear about that I think are useful, and rather than lose this information to the great abyss of my mind I would rather collect it in a place where I will have easy access to it and where it might be of some use to others.

Given the above, what type of content will I write about? From the outset at least, I plan to write about my continuing studies with routers, switches, anything related to wireless networking, and whatever else I'm learning about computer networks. For example, one topic I'll probably write about frequently is how easy it is to crack WEP (Wired Equivalent Privacy). However, I've still got quite a lot to learn, so I won't know everything. The format will probably be questions followed by answers (which will most likely include links) to various Windows/Cisco problems I'm involved with in class and at home at any given time.

My interests do not just lie in computers. I am also a Christian, and therefore I will most certainly write about whatever I'm reading, listening to, or thinking about from that perspective. I also have a brother, a sister, and a father who have different callings in life, so I attempt to keep up with whatever those different - real or, at this point, theoretical - vocations are. Needless to say, God provides me with quite a bit of writing material!

A lot of what I'll be doing is referencing articles or books, because I read more frequently than I type. Another habit that influences the preceding sentence is that I enjoy thoroughly understanding an issue before I proceed to tell others what I found. As a result, when I do type it tends to run a bit longer than I expected when I started, and it can probably come across as overkill. With this understanding in mind I'll end my first post.