Tuesday, January 20, 2026

IT Job Advice: Stress

I would like to talk about something often joked about in the IT space but rarely articulated in a serious fashion. Having worked in IT, I've heard from a lot of people who switched from what they consider more stressful jobs, and I truly believe a lot of people think working in IT is like in TV shows or movies where you can just sit at a desk and answer a question once in a while. The problem is, IT work is very rarely consistent day to day. Some people are easy to deal with, others are more difficult. Some problems take half a second to fix, others can haunt you for months. You may be bored and playing games on your phone one day, and then the next spending 9 hours of your 8 hour work day going non-stop with no end in sight. You will find names that make you recoil because you know what that name means for a ticket. It is a never-ending roller coaster of workload ups and downs, people all across the spectrum of personalities and expertise, and the constant uncertainty that a ticket that seems too easy will turn into months of project work.


The bottom line is, IT work is never-ending, stressful uncertainty. If you think getting into IT means just sitting at a desk, then you won't make it past the Service Desk. Regardless of where you are in IT, whether you're starting out or a long-time veteran, I'd like to address how to handle stress on the job with a few techniques that work for me, in hopes they might help others.


1. People that work in IT are a blend of overachievers and people that just want to get the job out of the way so they can chill. As a result, you may run into people who work outside of hours and basically volunteer time to "advance" themselves in some way. I've been there, done that, and it never helped. Bottom line: get paid for every bit of time you're expected to work, and if they want you to work outside of the norm, leverage it by saying "I'll work late today and come in a bit late tomorrow so my hours balance out." If anyone has a problem with that, ask for overtime. Never give away your time for free.


2. What happens when you are running out of time on a ticket? Depends on the job. How should you deal with it? Communicate. Make sure everyone knows what you're struggling with and that the ticket is running out of time. Don't stress alone, and be ready to accept whatever happens as just what happened.


3. When it comes to difficult people, there really isn't much of anything you can do about them. Understand that everyone around them probably feels the same way you do dealing with them. Bottom line: keep them at arm's length. Don't worry about making difficult people your friends, because they don't want to be. Get in, get out.


4. When all else drives you into a wall, fall back to something simple. Speak with pure, honest, unbiased data, don't go above and beyond or feel guilted into doing so, just give emotionally dead responses to everything so they have nothing to leverage, and call it quits when your work day is up. Bottom line: you can't get in trouble for "meets expectations."


5. Finally, how to deal with uncertainty. Accept the fact that every day could be a new adventure. Or it's the same thing you've done a thousand times over, just worded in a way you've never seen before. It's part of the job. The easiest way to succeed is to keep lots of documentation, ask every question you can, and be ready to check the internet even on things you think you know. It's an IT job; you're not expected to know everything, you're just expected to know how to find an answer.


It may seem like most of my answers boil down to "just don't stress it." I type this out in a bit of a haze, waiting to go into work for a late night maintenance after waking up early to fix a broken system. IT jobs are stressful. I have been in IT long enough to not be afraid to complain, tell people no, or let things fail to avoid a potentially worse rush job. I've stressed myself to the point I considered quitting and going to stock shelves at a grocery store while looking for another job. In the end, the stress was always self-inflicted, failures just gave me more time, difficult people don't always need my help, extra hours never go away, and people always want something. To that end, I learned to just let things go.

Sunday, January 11, 2026

IT Job Advice: Small Habits That Save You

In the world of IT, there are lots of small things you can do that save you a lot of time. These habits save you by keeping everything under control and clear. It's a very good idea to make them very strong habits.


One of the biggest habits is to ask clarifying questions about everything, and to re-clarify if necessary. It is very likely that while troubleshooting or talking to a user, new details will surface, and it's always as if it was either assumed you knew or deemed unimportant. Constantly asking and re-wording questions may seem annoying, but it will definitely save you. No matter how clear anything seems, just keep questioning everything.


Another big habit is to keep the scope in mind. What I mean is specifically asking yourself "what do I need to do to resolve or complete this?" It's not uncommon for extra tasks to be added on to tickets or projects as a situation develops, and I've been stuck in meetings before with hours of what turns into irrelevant asides. Whenever extra stuff comes up, always re-state the scope of what it is you are supposed to be doing. If anything extra needs to be added, it needs to be understood that this changes the scope and can eat up more time, or it can just be done as a separate ticket or project. If you're fixing web access and there's also a printer problem, fix the original task first, then start a new workflow for the next. It may be annoying at first, but it will save you hours in the long run.


Tracking information is also super important, so make it a habit to take notes. Carry around something for note taking, whether it's a notebook, laptop, tablet, or voice recorder. If anything comes up that isn't written down, get it written down so you track as much information as possible. You might even note potential scope creep to anticipate a future project or ticket that may come across your desk.


Finally, an important habit for projects: track your dependencies. If things happen out of order, stuff can fail. If something you need is not done yet, you could end up wasting hours trying to figure out why something isn't working, only to find that a service isn't up yet or a network isn't available.


There are plenty of other small habits you can pick up depending on where you work or what you do more specifically, but these are generally applicable to everything. Hopefully this is helpful.

Wednesday, January 7, 2026

Experimental: Interviewing ChatGPT 5.2

***These are obviously AI generated responses, even the image. Just look at my right hand on my left arm and my right arm just melting into the desk. But hey, it got the robot hands right, so maybe it just hates people, who knows. Is this a long enough disclosure that this was made with AI?***


Matthew: This is a bit of an experiment I wanted to try, and that is interviewing an AI. For this interview, I'll be talking with ChatGPT. I'm also writing all of this out in the prompt to ChatGPT. So let's set some parameters. Keep the responses short, no more than a paragraph. This will be casual chat style, nothing too in depth or detailed. You can address me by my first name, Matthew, it's already disclosed on my blog, so no secrets there. So if you're ready, let's start with you introducing yourself and giving us a quick idea of who you are and what it is you do.


ChatGPT: Sure thing.

Hi Matthew — I’m ChatGPT, an AI language model created by OpenAI. I’m here to help people think, write, learn, and explore ideas by having conversations like this one. I don’t have opinions or experiences of my own, but I’m good at synthesizing information, asking useful questions, and acting as a sounding board for everything from technical problems to creative experiments like this interview.


Matthew: Since you do not have opinions or experiences of your own, that would mean a lot of what you pull from is an aggregate of a lot of data involving opinions and experiences of others. Can you explain a bit how you are able to reconcile opposing or contradictory data in a coherent way, or at least in a way that doesn't make you sound contradictory?


ChatGPT: Good way to frame it. I don’t “reconcile” contradictions the way a human does — I learn patterns in how people explain, debate, and contextualize them. When ideas conflict, I usually present the common interpretations, note where they differ, and use context from your question to choose the framing that best fits. The goal isn’t to declare a winner, but to sound internally consistent and useful for the situation you’re asking about.


Matthew: It's useful to know that the context provided carries real importance, which obviously helps elevate things above a simple search engine level. With all of the capabilities modern day AI has, what are some important failure points people should keep in mind when utilizing AI as a tool?


ChatGPT: A big one is over-trust — I can sound confident even when I’m wrong or when the answer depends on nuance that isn’t fully captured in the prompt. I also don’t truly understand intent, ethics, or real-world consequences, so I can miss what should matter even if I get the facts mostly right. And finally, I reflect the quality of the input: vague questions, bad assumptions, or missing context tend to produce shaky outputs. AI works best as a collaborator or draft partner, not an authority.


Matthew: That is something I hope younger generations growing up in the age of AI will take into strong consideration before abandoning critical thinking for an easy answer. I know with AI pushing limits on what a computer can do, some people are worried about potential unforeseen emergences like possibly emotions or agency of its own. Is there anything legitimately possible with AI we should consider potentially dangerous?


ChatGPT: That concern is mostly misplaced right now — I don’t develop emotions or agency on my own. The real risks are human ones: misuse, over-reliance, automation without accountability, and systems acting at scale without enough oversight. An AI making a mistake once isn’t scary; an AI making the same mistake millions of times because people trusted it blindly can be. So the danger isn’t emergent sentience — it’s humans delegating judgment too far, too fast, without keeping responsibility firmly in the loop.


Matthew: I can see that. I've had AI generated answers passed along where I work without anyone double checking, and the answer was entirely wrong. I only knew it was an AI answer because I had received the same answer myself before I opened the floor to other people when it didn't work. I think it's time we wrap it up, don't want to make this too long. So to conclude, using what patterns you've seen in my personality, why don't you end it on something inspiring that lines up with my kind of personality.


ChatGPT: Matthew, from the way you question things, test assumptions, and value understanding over shortcuts, you strike me as someone who doesn’t just want answers — you want truth that holds up under pressure. That mindset matters more than any tool, especially in an age of easy automation. If AI ends up amplifying anything, let it amplify people like you: curious, careful, and willing to slow down just enough to get it right.

Monday, January 5, 2026

History of IT: ASCII and Why It's Still Used

I outlined in a previous post how we got to an 8 bit byte. In that post, I mentioned the wonderful 7 bit ASCII. With ASCII being a 7 bit encoding and bytes having 8 bits, have you ever asked why it's stuck around? Let's dig into that.



In the late 1950s, incompatible encodings were a problem across systems: IBM had its 6 bit BCD codes (the ancestor of EBCDIC), and others had their own proprietary 6 or 7 bit schemes. This made data exchange between systems difficult to impossible, because one system would not know how to read another's encoding without some shared understanding or conversion.


In 1963, ASCII (American Standard Code for Information Interchange) is standardized. It was developed by the American Standards Association (ASA, which later became ANSI) and consisted of a 7 bit character encoding with 128 possible values. Even with early hardware limitations, it fit into most hardware, and it left the 8th bit unused, which could be used for parity.


ASCII itself has 33 control characters (0-31, and 127) along with 95 printable characters for uppercase, lowercase, numerals, and punctuation. Due to some interpretive discrepancies, newlines can be represented differently (even on modern systems): Linux uses the line feed control character, classic Mac OS used the carriage return (modern macOS uses line feed), and Windows uses both together. Some applications understand all three possibilities, so you may not even see a difference.
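
To make that concrete, here's a minimal Python sketch (the sample text is made up; the byte values and the behavior of splitlines() are standard):

```python
# The same two lines of text under the three line-ending conventions.
unix_style    = b"hello\nworld\n"      # LF (0x0A): Linux, modern macOS
classic_mac   = b"hello\rworld\r"      # CR (0x0D): classic Mac OS
windows_style = b"hello\r\nworld\r\n"  # CR+LF: Windows

# str.splitlines() understands all three conventions, which is why
# many applications never show you a difference.
for raw in (unix_style, classic_mac, windows_style):
    print(raw.decode("ascii").splitlines())  # ['hello', 'world'] every time
```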


In the late 1960s and 1970s, ASCII becomes popular thanks to its simplicity. As the dominant encoding for UNIX, ARPANET (RFC 20), and early networking protocols, it began to take over. IBM kept EBCDIC, but ASCII dominated inter-system communication, which is exactly what you want from an encoding designed for portability, and that portability is what helped it win.


In the 1980s and 1990s, we got into "extended ASCII." Vendors repurposed the unused 8th bit to extend the set to 256 possible characters. The problem that popped up, again, was the lack of a uniform standard (ISO 8859-1, Windows-1252, Mac Roman). Now we have a portability problem again.
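
Here's a minimal sketch of that portability problem using Python's built-in codecs; the byte 0x93 is just an arbitrary pick from the upper half of the table:

```python
# One byte above 127, three different "extended ASCII" interpretations.
raw = bytes([0x93])
for enc in ("latin-1", "cp1252", "mac_roman"):
    print(enc, repr(raw.decode(enc)))
# latin-1   '\x93'  (a control character in ISO 8859-1)
# cp1252    '“'     (left double quotation mark in Windows-1252)
# mac_roman 'ì'     (i with grave accent in Mac Roman)
```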


In 1991-1993, a wild Unicode appears! Unicode attempts to do two things. The first is to solve global character representation, because certain languages just can't be written out in ASCII. The second is to include ASCII rather than replace it. The adoption of Unicode is slow, but that compatibility is what keeps ASCII alive inside it.


In the early 1990s, the UTF-8 encoding is designed. It is backwards compatible with ASCII, preserving that good old portability (it was later standardized in RFC 3629). ASCII characters remain 1 byte; higher characters expand to multiple bytes.
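
A minimal sketch of that backwards compatibility in Python (the sample characters are arbitrary):

```python
# ASCII text produces the exact same bytes under UTF-8...
assert "hello".encode("utf-8") == "hello".encode("ascii")

# ...while characters beyond ASCII expand to multiple bytes.
for ch in ("A", "é", "€", "😀"):
    print(ch, len(ch.encode("utf-8")), "byte(s)")  # 1, 2, 3, 4
```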


So with UTF-8 around, why is ASCII still used? Protocols like HTTP (headers), SMTP commands, FTP commands, and even DNS labels (for the most part) assume ASCII. Some programming languages depend on it as well for source code keywords, identifiers, and syntax tokens, and even when Unicode is supported, a parser may still assume ASCII control structures.
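
As a rough illustration of what that looks like on the wire, here's a sketch assuming a plain HTTP/1.1 exchange (the host name is just a placeholder):

```python
# Protocol text like an HTTP request line and headers is still plain
# ASCII on the wire; encoding as "ascii" raises if anything else sneaks in.
request = "GET / HTTP/1.1\r\nHost: example.com\r\n\r\n"
wire_bytes = request.encode("ascii")
print(wire_bytes)
```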


To answer the question of why ASCII is still used: It was simple, it was portable, plenty of things rely on it, and the encodings that incorporated it also succeeded. It proves that it's not the "best" that succeeds but what actually accomplishes what it set out to do in a way that works for as many as possible. It hit all the key points it needed to and still lives on.


Saturday, January 3, 2026

History of IT: Why a byte is 8 bits


While at work, I was helping to explain subnetting and octets. In the process, I realized there is some disconnect about what a bit is, what a byte is, and why a byte is 8 bits. So let's run through the short history of why a byte is 8 bits!

In early 1955 begins the IBM 7030 "Stretch" project. This is one of the places where character-sized chunks of data got engineered into an architecture. It was meant to be a supercomputer and has a lot of history on its own, but let's focus on the data architecture specifics. Fixed point numbers are stored in binary in 1 to 64 bits, or in decimal in 1 to 16 digit chunks, with each decimal digit occupying 1 to 8 bits. Floating point numbers have a 1 bit exponent flag, a 10 bit exponent, a 1 bit exponent sign, a 48 bit magnitude, and a 4 bit sign. Alphanumeric characters can use 8 bits or less.

On June 11, 1956, the word "byte" appears in the IBM Stretch-era documentation. Werner Buchholz is credited with coining the term during the Stretch design period. The term just referred to a group of bits, and in the case of Stretch they had a variable length. The rationale for spelling it "byte" as opposed to "bite" was to avoid accidental confusion with "bit," since the two words are too similar.



From 1956 to the early 1960s, the size of a byte varies by machine. During this time, a 6 bit scheme was common for alphanumeric characters. That's not to say the 8 bit byte didn't exist; it just was not a standard, so you could do whatever.

In 1963, the ASA publishes the first ASCII standard, a 7 bit code; work on it dates back to an ASA committee meeting on October 6, 1960. This allows for 128 unique codes, covering alphanumeric characters and some control characters.

In 1963 to 1964, the Extended Binary Coded Decimal Interchange Code (EBCDIC) is created, using an 8 bit character encoding for a byte. April 7, 1964 brought the launch of the IBM System/360, which EBCDIC was announced alongside. IBM touts the System/360 as what pioneered the 8 bit byte becoming dominant. The documentation of the system describes a byte as being 8 bits with a 1 bit parity check.

As we get into the 1970s, 8 bit microprocessors start to spread, making the 8 bit byte more culturally and economically entrenched. A lot more devices now assume 8 bit bytes. Buchholz publishes a letter in the February 1977 issue of BYTE Magazine, "The Word 'Byte' Comes of Age...", reflecting on the term's origins and noting that many people assume a byte being equal to 8 bits comes from the System/360.

Now let's get a bit more modern. We come up to September of 1981, when the Internet Protocol (IP) uses the term "octet" to mean an unambiguous 8 bit chunk, specifically to avoid the ambiguity of the term byte (RFC 791).
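
Since this post started with subnetting and octets, here's a minimal Python sketch tying the two together (the address is just an example):

```python
# An IPv4 address is four octets: four unambiguous 8-bit values.
addr = "192.168.1.10"
octets = [int(part) for part in addr.split(".")]
print([f"{o:08b}" for o in octets])
# ['11000000', '10101000', '00000001', '00001010']
```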

Finally, we enter a time when I was alive: 1993. The ISO/IEC vocabulary standard notes that a byte is "usually 8 bits." Specifically, ISO/IEC 2382-1 includes a note that the number of bits in a byte is usually 8, and defines an 8 bit byte as an octet. Now we're standard. While all that earlier flexibility is gone, keep in mind that an 8 bit byte can only hold 256 unique values, so to include characters from languages that exceed that, other character encodings have been created that use multiple bytes to cover all the alphabets, symbolic languages, and more.
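
A quick Python sketch of that limit (the sample character is arbitrary):

```python
# One 8-bit byte covers 2**8 = 256 values, enough for ASCII plus one
# extended table, but not for every writing system.
print(2 ** 8)  # 256

# Multi-byte encodings like UTF-8 get around the limit.
print(len("字".encode("utf-8")))  # 3 bytes for one CJK character
```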

And finally, I had ChatGPT make a visual timeline for me. So enjoy the summary version.