#ImaginingProgramming #LearningToCode #ProgrammingJourney
Have you ever wondered how computer programs were made before you actually started learning to program? 🤔 It’s a question that many people have pondered as they embark on their coding journey. Before diving into the world of programming, many of us had preconceived notions of how a program was created. Let’s take a look at some of the common misconceptions and how the reality of programming differs from what we imagined.
## Preconceived Notions
When we were younger or before we actually delved into the world of programming, many of us had our own ideas of how computer programs were made. Some common misconceptions included:
1. Magical Process: Some people imagined that programming was some sort of magical process where a few keystrokes on the computer could make anything happen.
2. Complex Wizardry: There was a common belief that programming involved complex formulas and mysterious codes that only geniuses could understand.
3. Instant Results: Many imagined that once a program was written, it would immediately work perfectly without any bugs or errors.
These are just a few examples of the misconceptions that many people had about programming before they actually started learning to code. The reality, as many of us have discovered, is quite different.
## The Reality of Programming
The truth is that programming is a skill that requires dedication, problem-solving abilities, and a deep understanding of logic. Before we learn to program, we may not have realized the intricate processes involved in creating a computer program. Here are some key points to consider when it comes to the reality of programming:
1. Logical Thinking: Programming involves a great deal of logical thinking and problem-solving. It’s not just about typing lines of code – it’s about understanding the logic behind the code and how to achieve a specific outcome.
2. Trial and Error: Contrary to the misconception of instant results, programming often involves a lot of trial and error. Writing code, testing it, and debugging are all part of the process.
3. Continuous Learning: The world of programming is constantly evolving, so learning to program is an ongoing journey. It’s not just about learning a specific language – it’s about adapting to new technologies and methodologies over time.
## The Evolution of Programming
If you had asked someone 50 years ago how computer programs were made, their answer would have been very different from the one you’d get today. The evolution of programming has brought about new languages, tools, and techniques that have transformed the way we create software. Some key points to consider in the evolution of programming include:
1. Early Programming Languages: In the early days of programming, languages like COBOL and Fortran were used to write software. These languages were much more limited in their capabilities compared to the languages we use today.
2. Object-Oriented Programming: The advent of object-oriented programming brought about a new way of thinking about software development. This approach allowed for greater reusability and modularity in programming.
3. Advances in Automation: With the rise of automation and AI, programming has also evolved to include tools and frameworks that streamline the development process.
## The Future of Programming
As we look to the future, the world of programming is poised for even more innovation and advancement. The way we imagine programs being made will likely continue to evolve as new technologies emerge. Some key trends to watch for in the future of programming include:
1. Quantum Computing: The emergence of quantum computing could revolutionize the way we write programs, opening up new possibilities for solving complex problems.
2. Augmented Reality: With the rise of AR and VR technologies, programming will likely see a shift towards creating immersive experiences that go beyond traditional 2D interfaces.
3. Blockchain: The integration of blockchain technology into programming could pave the way for new types of secure and decentralized applications.
In conclusion, the way we imagined programs being made before we learned to code rarely survives contact with the real thing. The reality of programming demands a solid grasp of logic, strong problem-solving skills, and a willingness to adapt to new technologies, and the field will keep evolving, shaping the way we create software in the years to come. So, if you’ve ever wondered how programs were made before you started learning to program, the answer is both more complex and more fascinating than most of us imagined. Happy coding! 🖥️🚀
Before I started I assumed programming was like bashing away at hackertyper for hours rather than desperately googling error messages and shouting expletives at my monitor.
When I first started, I thought the most impressive stuff was massive scripts I could barely comprehend. So I thought the role of a programmer was to be this godlike machine that could remember all the intimate details of their 10,000-line function.
Now that I have a bit more experience, I realise the most impressive codebases are those that make following, amending, and extending complex operations relatively easy via abstraction.
When I was in school I knew how cartoons were made, so I assumed video games were similar. I thought someone had to draw each and every possible frame in the video game, and that pressing the buttons was like choosing your path through a slideshow of those frames.
That was about 40 years ago already, when I was a small kid, back in the age of the Commodore 64 and the ZX Spectrum. I thought the software and games were somehow made using larger machines, kind of like the ones you find in electronics labs and factories. Something along those lines. That is, not by actually programming the computer itself.
It felt magical when I actually wrote my very first lines of BASIC into my friend’s ZX Spectrum. That was the day I knew I wanted to be a programmer.
I thought it was done with something like PowerPoint, but at a much larger scale, with thousands of slides.
As a kid, whenever I played games on keypad phones (in 2005-12, when I was 4-10 years old), I would see logos with names like Java and Unity and thought they were companies that made those games. Never did I think they were programming languages or graphics engines. Fuck, I didn’t even know there was a thing called “programming”. I just thought about how many PowerPoint slides it would take to make a game like that, and how they structured those slides to respond to the user’s inputs.
When I was in elementary school and heard that the computer’s “language” is binary, I assumed you had to write programs in binary so that the computer could understand them.
I figured you typed beeps and boops in the proper sequence.
Not me because I started programming when I was 10, but a friend of mine came with me to a computer store in the early 80s, walked up to an Atari 800 computer and typed “Basketball”, thinking that’s what it would take to create a game.
I’m actually not sure, probably something like sticking a bunch of pre-made modules together or something. Don’t ask me how the pre-made modules were made or anything, I don’t think I ever got that far.
What really confused me for a long time was how the computer took this long, unbroken string of 0s and 1s and made sense of it. Like, how did it know where a byte started or ended? What finally clicked was learning that my fundamental model was wrong. It’s not an unbroken string of 0s and 1s; it’s small, discrete sets of 0s and 1s, and the meaning ultimately rests with how the CPU is *physically* put together. It’s nothing to do with programming.
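Here’s a small Python sketch of that idea (the byte values are arbitrary, picked purely for illustration): the same four bytes come out as completely different values depending on how the decoder chooses to group and interpret them.

```python
# The same four bytes mean different things depending on how you decode them:
# the meaning lives in the decoder, not in the bits.
import struct

raw = bytes([0x42, 0x28, 0x00, 0x00])

print(struct.unpack(">I", raw)[0])  # as a big-endian unsigned int: 1109917696
print(struct.unpack(">f", raw)[0])  # as a big-endian float: 42.0
print(raw[:2].decode("ascii"))      # the first two bytes as ASCII text: B(
```

A CPU’s instruction decoder does something similar in hardware: it’s wired to expect bits in particular shapes, which is why the boundaries never need to be written into the bitstream itself.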
For the first part: I wasn’t interested in programming when I started. I needed to fill a credit in high school and that was the only class that fit in my schedule. I was very aware, though, that it was like starting completely from scratch, unlike English or math class: in math I knew roughly what trigonometry was before I took it, but with programming I had literally zero guesses about what it would be. So I found that really interesting: not just learning how to do things, but learning what the possibilities even were.
For the second question: no, there’s no fundamentally different way to program. This is the basic idea of the Church-Turing thesis and some results that derive from it: once you hit Turing-completeness, there’s no more powerful computational model. Basically, everything becomes syntactic sugar once you can simulate a universal Turing machine.
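To make “simulate a Turing machine” concrete, here’s a minimal Python sketch of one: a tape, a head, and a state table, and nothing else. It’s a toy machine that just flips bits (not a universal machine), and all the names in it are made up for the sketch, but it shows how small the underlying model is; in this sense, everything richer languages give you is convenience layered on top.

```python
# A minimal Turing machine simulator: a tape, a head, and a state table.
# The sample machine flips every bit and halts at the first blank cell.

def run(program, tape, state="start", blank="_"):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# (current state, symbol read) -> (symbol to write, head move, next state)
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run(flip_bits, "10110"))  # -> 01001_
```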
I thought each pixel a window had would be driven by code, one by one; that I’d be building up the exit button’s red pixel by pixel.
For what it’s worth, this does actually happen; it’s just typically abstracted away so things aren’t so tedious. But that logical leap, that it can be abstracted away, wasn’t apparent to me at the time.
It’s still kind of interesting to think how right and wrong I was at the same time.
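As a rough illustration of how that intuition was right and wrong at once, here’s a toy Python framebuffer (set_pixel and draw_rect are invented names for the sketch): every drawing operation really does bottom out in individual pixel writes, but the abstraction does the pixel-by-pixel work so you never have to.

```python
# A toy framebuffer: drawing still happens one pixel at a time under the
# hood, but draw_rect hides the tedium behind an abstraction.

WIDTH, HEIGHT = 16, 8
framebuffer = [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]

def set_pixel(x, y, color):
    framebuffer[y][x] = color  # the per-pixel write everything reduces to

def draw_rect(x, y, w, h, color):
    for row in range(y, y + h):        # the loops you imagined writing
        for col in range(x, x + w):    # by hand, done once and reused
            set_pixel(col, row, color)

draw_rect(2, 1, 5, 3, "#")  # say, the red block of an exit button
for row in framebuffer:
    print("".join(row))
```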
I didn’t know what HTML and CSS were; I thought the frontend of a program would be made in a similar fashion to how something like Squarespace or Wix works, where you just drag components or shapes around to build the frontend.
I didn’t think much code was needed. Turns out, I was wrong.
I thought if you made the application icon, that would create the whole program lol