How I Became a Developer in Six Months (and Counting...)

What I’ve learned, and am still learning, from my journey into web development so far

Last June, I came up with this crazy idea to learn to code and start a career in tech. I had just finished a one-year post-doc teaching history to college students, and I really had no idea what I wanted to do with my life. I felt burned out searching for jobs in a field that offers fewer positions than it did during the 2008 crash, and even fewer that don't require candidates to uproot their lives in return for no more than a one-year contract. I wasn't sure what I wanted to do next, but I was curious about coding.

As a teacher, I had already started to wade into the shallows of the digital pool. I had incorporated GIS StoryMaps into my classes, and I was starting to play around with Esri software, in the hopes of starting a career as a GIS Analyst. I had learned that an ability to program was useful for a career in GIS, and I figured that I would probably be learning a little bit of Python at some point. A family friend also suggested I consider a career in software, given my background with learning languages. I had spent most of my life thinking that coding was only for super-smart math nerds with CS degrees and/or their own software companies, but the more I thought about the work that coding requires, the more it seemed to align with the strengths and passions I had cultivated in my humanities background: creativity, research, critical thinking, problem-solving, and persistence. Soon, I was drawn into the world of code. Instead of being a GIS professional who could do some programming, I realized that I wanted to be a programmer who could do some GIS.

I started to look for a way to learn the skills necessary to make this into a viable career. I glanced at coding bootcamps, most of which offered grandiose promises of instant employment upon finishing, but I also found FreeCodeCamp (FCC), which offered (as I saw it) similar prospects for someone with enough self-motivation. Through FCC, I also happened upon an article about landing a developer job in four months. Having recently earned a Ph.D., I figured I had enough ability and drive to achieve this goal easily. I told myself that I'd plow through the FCC curriculum, and by early 2019 I'd be living the sweet, sweet life as a junior-level web developer! Maybe I'd even be writing my own article about how I managed to get my first developer job!

It's been six months since I made that decision. As you may have guessed, "the job" hasn't come yet. The truth is, teaching myself to code is the hardest thing I've ever had to do. Yes, including earning my Ph.D. Graduate school was no picnic, but I benefited from the structure of the school year, knowing what I had to do, when I had to do it, and what I was working toward in the long run. Plus, I was funded, so even if I wasn't making the big bucks, I had a regular source of income. Over the past six months, by contrast, I have had to provide myself with the structure that was automatically there for me in graduate school. At the same time, I haven't had the regular (if modest) revenue that my graduate stipend (or later, my one-year post-doc salary) offered. I've often felt like I've been finding my way in the dark.

Yet even though my self-taught developer plan has taken longer than I intended, I am still loving coding, and I'm still on the journey. I've already learned a lot: not just coding skills, but also more about my own motivations, challenges, and aspirations. I thought this would be a great time to share some of my insights.

These are the pieces of advice I would have given myself from six months ago. Perhaps they'll be the same pieces of advice I'll need to remember six months from now. If you're thinking of taking up coding, perhaps these will be of use to you, but I think a lot of this applies to any sort of major career transition. Everyone's experience is unique, though, so only take what helps you! 😉

1: It will take as long as it takes.

It's easy to read bootcamp ads that say things like "Go from [insert crappy job] to Software Engineer for Google in 8 weeks!" and think that, with enough self-motivation and determination, you can achieve similar results in the same amount of time (give or take a few weeks). And if you can, hey, that's awesome! But there's no set time to "learn code," particularly since nearly every developer I've ever talked to, regardless of seniority, has remarked that they are still learning. I already knew that pursuing work in tech meant I'd always be learning, but it's more than that: you can't even put a time frame on learning enough code to land an entry-level developer job. If you keep working at it, it's more likely to work out than not. But you can't say whether that will happen in eight weeks, eight months, or eight years. (Let's hope it won't be eight years, though!)

2: Take care of your life, then take care of your code.

At the moment, I'm in a relatively fortunate position, since my wife has a full-time job with benefits. So while I'm trying to build a new career, I don't have to worry about all those pesky concerns like healthcare, food, rent, bills, etc. I'm willing to acknowledge that I'm in a very privileged position, and I'm grateful for it.

Here's the thing, though: even in my relatively comfortable situation, I've found that I still constantly worry about money. Yes, we have enough for now, but my wife's salary isn't enough for us to last forever on just her income, particularly if we want to think about long-term things like a family and retirement accounts (which seem to be getting more expensive by the minute). On top of all this, I've been spending a good deal of time, money, and energy managing chronic pain (a relic of my constant reading and typing in graduate school) and caring for an aging cat.

If you want to learn to code, especially if you're going the self-taught route, you'll have to deal with the Catch-22 of job hunting in today's market: you need experience to get money, but you need money (and time) to get experience. The quickest path to a job may be moving to a tech-heavy metro area and immersing yourself in an intensive 8-week bootcamp where you code for 12 hours a day. If you want to do this, go for it. But first and foremost, you need to be honest with yourself about what you need in life and what the best plan of attack is to get it. If your finances, your family life, and your health can't afford such an intensive program, you need to do the best you can with what you have. This may involve taking on a part-time job somewhere to help bring in money and to give you structure while you learn. It may even involve working full-time and devoting your evenings to learning code.

Everyone will have a different balance, and I'm still working on mine. But for me, it's been essential to remember that coding is just one element of a whole variety of factors that bring my life satisfaction. Work hard, code lots, and be persistent, but don't forget that you're more than your code.

3: Listen to yourself before anyone else.

If you're going to learn to code, it's essential to remember why you're doing it. This is a fun path, but it is also a very difficult one. Constantly learning means constantly feeling like I don't know what I'm doing, and it's incredibly easy to freak out and/or burn out. To keep moving forward, it helps to remember why I like coding: the creativity of building things, the mental challenge of puzzle-solving, and the opportunity to make more money.

I've also found it important to acknowledge that my interests develop as I learn more. I didn't start graduate school knowing exactly what I wanted to write for my Ph.D. dissertation, and I shouldn't expect myself to know exactly what I want to code for the next 3-5 years of my life. When I started on this path, I wanted to learn Python and get into data science, in order to follow my GIS interests. Over time, I've found myself more and more drawn to front-end web development, because I enjoy the process of building web pages and apps. I also enjoy thinking about aesthetics, so I'm increasingly curious about UX/UI design.

There are a lot of different skills to hone, and a lot of different projects to work on. Thinking honestly about what you want, and listening to that little voice inside your head, will help you figure out how to follow your passions in code.

4: Get organized!

For me, the last year has been filled with harsh truths. Perhaps one of the most shocking truths for me has been that I'm not nearly as well-organized as I'd like to be. As a teacher, I was always able to keep track of records in Excel spreadsheets, and I've managed to stay on top of deadlines well enough through most of my life. In code, however, there are so many different things to look at, so many shiny objects to capture my attention, that it's been incredibly difficult to determine how best to manage my time and energy in order to achieve short-term and long-term goals.

There are a few apps that I've found to be particularly helpful in organizing my tasks and projects. I track how I'm spending my time with Hours, and I've recently signed up for Trello in order to create boards for various projects, learning goals, people to contact, etc. Pocket has been a great resource for the hundreds of websites, apps, and articles that I find: instead of keeping 20 browser windows open at once, I can save pages with specific tags, so I can look at them later.

I imagine organization will always be an area for me to improve. With so much to do, so much to learn, and so many sources to go through, staying organized is particularly essential for learning to code.

5: Get out there!

Every piece of advice I've read about finding a job, regardless of field, talks about the importance of networking. I used to always groan at this kind of advice. As an introvert, I'm most comfortable on my own, or with people I know, and going out and meeting new people has never been my strong suit. I think that is part of what appeals to me about coding: the work allows you to spend a lot of time in your own head, trying to figure out the best way to design, build, and debug for a project.

Over the past six months, though, my biggest motivator has been interacting with other people. I love being in my own head, but if I had spent the last half-year sitting at my desk working on algorithms all day, I probably would have burned myself out.

If you want to learn to code, particularly if you're not doing it through a formal degree program or bootcamp, I can't tell you how essential it is to find people to help you on your journey. The online communities of FCC, dev.to, and CodeNewbie are great, and I recommend following them. Even more important, in my experience, is finding people to talk to in person. Look for meetups in your area. Personally, I've been fortunate that there are welcoming communities in my hometown of Knoxville (huge shout-out to Code Connective and KnoxDevs!).

Everyone says that networking with people is a crucial part of landing a job. I hope this will also be the case for me, but in my experience so far, networking is a lot more than just a means to an end. Meeting with other developers has given me invaluable access to feedback, collaboration, direction, motivation, and even commiseration, as I have pushed forward on this path. I may be a self-taught developer (in that I don't have a CS degree and haven't been to a bootcamp), but I'm definitely not doing this alone.

I hope some of this has been useful to you! If you're in a similar situation, whether learning to code or on another career path, what things have you learned that help you move forward?


Academics, Coding, and Learning How to Learn

In May 2009, the semester before I started graduate school, I attended a conference on Late Antiquity, the field of history I intended to study for my doctorate. I was excited for the conference because I thought I would be familiar with the subjects of the talks. After all, I had been a star student in my undergraduate program, getting As in Latin and Roman History courses and easily memorizing what I thought were the key components of studying ancient history, such as emperors’ ruling dates, key battles, and the names of various barbarian confederations.

Very quickly, it became apparent to me just how much I didn’t know. Sure I could rattle off key dates, but the level of specialized knowledge of individual texts, scholars, and methods of research that the conference presenters shared was way over my head. I felt like I had just moved cross-country to devote the next phase of my life to something I had absolutely no grasp of. I was overwhelmed, an impostor, and a fraud (or at least that’s how I felt).

Of course, since then, I have become more familiar with the field of Late Antiquity. In graduate courses and my own projects, I learned much more of the specialized knowledge and skills that had seemed so foreign to me at that first conference. I’ve researched and written about a specific area of late antique history, and have written a dissertation that has proven to a community of scholars that I have enough of a grasp on the field to deserve a doctorate.

Here’s the thing, though: all that knowledge isn’t the most important part of my experience. Neither is the dissertation that I produced from it. Even after studying for nearly a decade, there are a lot of things about Late Antiquity that I still don’t know. I can’t tell you, off the top of my head, what the average fourth-century peasant ate for breakfast in rural Cappadocia, or how bishops in small towns in north Africa conducted business within the imperial bureaucracy, or how widely classical Greek medical theories were understood among educated fourth-century Christians.

The most important thing I learned in graduate school is how to embrace the state of constantly learning.

This is an important lesson for me to remember now, as I have taken up coding in order to pursue a career in web development. Many times over the past few months I’ve thought back to that first conference, when I felt so out of place. When you start programming, there are about as many things to learn as there are grains of sand on the beach. Even narrowing things down to one specific programming language, like JavaScript, leaves mountains of concepts, tools, and methods to learn. What’s more, conversations with programmers have taught me that most people don’t devote their careers to one particular language, and they always end up learning new things on the fly as they start new jobs and take up new projects. Learning a programming language involves not just learning the language itself, but learning the language of the language: all of the technical phrases and jargon that people in the field tend to take for granted. Just as I did with my history studies, I have to be patient with myself as I learn these languages.

Even as I write this now, I realize something that I wish I had known back in 2009: knowing how to learn is far more important than knowing particular facts. As I do my daily coding, I often find myself frustrated that I can’t get my code to do what it’s supposed to do, and sometimes (but only after struggling through it myself) I start searching StackOverflow and FreeCodeCamp forums for the “answer” to the problem. When I find that “answer,” it usually ends up raising a number of further questions: why does a certain method work? Why does it work on strings, but not integers (or vice versa)? Why did my first attempt to solve the problem not work? How can I remember this in the future and apply it to similar problems? I know more about coding today than I did three months ago. What’s more important, however, is that I know what I don’t know, and I am beginning to know what kinds of questions to ask.
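To make that string-versus-number puzzle concrete, here is a tiny JavaScript sketch. The values and the method are purely hypothetical, my own illustration rather than code from any project I’ve worked on:

```javascript
// Hypothetical illustration: the same method call works on a string
// but fails on a number, because .includes is defined for strings (and arrays),
// not for numbers.
const word = "buckle";
console.log(word.includes("le")); // true

const num = 12345;
// num.includes(3) would throw: TypeError: num.includes is not a function
// Numbers have no .includes method, so convert to a string first:
console.log(String(num).includes("3")); // true
```

Finding the “answer” (convert the number to a string first) is one thing; understanding why the conversion is needed is what makes the lesson stick.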

Do I still have moments where I start freaking out, asking myself why the hell I thought it would be a good idea to leave my comfort zone and learn to code, and wondering if anyone would ever actually pay me to do this? Definitely. I certainly hope that as I continue, I will have fewer of those moments, but the truth is, I think they will continue for quite some time. What is more comforting to me is the knowledge that I’ve had these moments before. Whenever I encounter a new coding problem, read a blog about something completely foreign to me, or attend a meetup where I’m not even sure that everyone around me is speaking English, I like to remember that I’ve been in this situation before. The most important part of my graduate education is that I’ve learned how to accept and embrace it.

Ancient Languages and Computer Languages

How does an ancient historian decide to start learning to code?

The short answer: it’s all about language.

For me, it all started to click this past summer, while I was back in Colorado visiting family. I’d finished a one-year postdoc, and begun to think about what to do next with my life. For the past year, I had been growing less interested in pursuing academic research, and I was ready to move on to a new chapter. I’d been thinking that I needed to learn some programming basics as part of my budding interest in GIS, but I had not seriously thought of programming as a career path. To me, it was a tool that I needed to learn if I wanted to be, say, a GIS Analyst for a local municipal government.

One evening in Colorado, I had dinner with my parents and some of their friends. The day after, one of my parents’ friends (who has regrettably passed away recently) called to tell me that he thought I might want to try to pursue a career in coding, given my interests in technology and language, combined with the problem-solving skills I built earning my Ph.D.

Normally, I politely nod and turn away career suggestions from family and friends of family (“Oh, my nephew Johnny is the director of a museum. Why don’t you apply to be a museum director?”), but the more I thought about coding, the more it made sense to me. The truth is, I have always found language to be the best part of studying ancient history. I have a knack for learning languages and understanding their structure. It’s not that I learn instantly: drilling new grammar and vocabulary has always been a challenge for me. But it’s a challenge that, somehow, makes sense. When learning ancient languages like Latin and Greek (or, indeed, modern languages like French and German), there’s a structure to the language, demonstrated by rules that determine what certain words can do, how they can change, and how they combine. To be sure, language is fluid, and there are exceptions to just about every grammatical construction in any language you find. But there’s also a method to the madness. Languages exist for the purpose of communicating information and achieving functions, and the rules, even if they are flexible, help to provide a framework for communication.

This is a part of coding that has clicked for me. When I see an expression in JavaScript, I can understand that there are rules that govern what the expression does, and what sorts of data it can include. This doesn’t mean that I know all of those rules or the best way to apply those rules to a given situation, but I can see that there is a structure that allows people to communicate in particular ways to a computer, so that the computer will in turn perform certain tasks. Just as Latin uses different endings to connote different values, JavaScript uses different kinds of variables, objects, and arrays to connote different values (and values within values).
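To show what I mean, here is a small, made-up JavaScript sketch; the names and values are my own invention, chosen only to illustrate how different kinds of values follow different rules:

```javascript
// Made-up example: three kinds of values, each governed by its own rules.
const emperor = "Julian";                // a string
const reign = { start: 361, end: 363 };  // an object: named values within a value
const cities = ["Athens", "Antioch"];    // an array: ordered values within a value

console.log(emperor.toUpperCase());      // "JULIAN"  (a rule for strings)
console.log(reign.end - reign.start);    // 2         (a rule for numbers held in an object)
console.log(cities.map(c => c.length));  // [ 6, 7 ]  (a rule for arrays)
```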

Deciphering texts in ancient languages is, in many ways, solving a logical puzzle. It seems to me that writing clear code that people can follow and that machines can interpret is also a logical puzzle. I look forward to finding more ways to solve these sorts of puzzles!

Athens, Jerusalem, and Transitioning Careers

What has Athens to do with Jerusalem? What has the Church to do with the Academy?

  • Tertullian (c. 200 CE)

Like many Humanities Ph.D.s in today’s job market, I am currently transitioning out of academia, having decided that I do not want to continue year after year of putting my life on hold in the hopes of attaining a rare tenure-track position. Last summer, I finished a limited-term postdoctoral lectureship and decided not to continue pursuing the professorial career that I had been trained to pursue for nearly a decade. At the moment, I am building a career in tech, and doing my best to find ways to connect my previous life as an academic nerd with my new life as a computer nerd.

I don’t want to write a sorrowful lamentation detailing all of the things wrong with the modern university; there are several good ones that you can read already, or you can just google “quit-lit” to get an idea of what people are facing. To be sure, there are plenty of adequate reasons for Ph.D.s to feel bitter about entering a market so dreadful, in which exploitation of contingent faculty is such a widespread problem. But others have written about this angle already, and I have nothing new to add.

What I would like to share is how I am approaching this transition from my background as a historian of Late Antiquity, the period of Roman/European/Mediterranean History from approximately 200 to 800 CE (these dates aren’t set in stone, but the broader chronological range offers a more productive model than the traditional “Fall of Rome/Dark Ages” categories for studying these centuries). This was a period of considerable change in Europe and the Mediterranean (and indeed, even beyond). In addition to the fragmentation of western Roman political authority, this period saw the rise of Christianity as a state-sanctioned religion, the development of various Christian theologies in ecumenical councils and episcopal disputes, the establishment of Jewish texts such as the Mishnah and the Talmud, and the rise of Islam in the eastern Mediterranean and the Middle East. Late Antiquity was a time of transition, in which old ideas about Rome, humanity, and the gods (or God) were disputed, discarded, and transformed in different ways and among ever-changing communities throughout the Roman world and beyond its borders.

Education in particular saw significant change during this time period. The quote I cited above is from Tertullian, a Christian author who lived in North Africa (specifically, present-day Tunisia) around the turn of the third century CE. Tertullian sought to define an ideal Christian by describing what they shouldn’t do — elsewhere, he wrote that Christians shouldn’t go to popular Roman entertainment events such as the theater, gladiatorial games, or chariot races. In his famous quote about Athens and Jerusalem, he suggested that proper Christian life was fundamentally incompatible with the “worldly” education of the ancient world: what does Athens, the cultural center of the Roman Mediterranean (comparable in its reputation for status and learning to a modern-day Ivy League school), have to do with the holy city of Jerusalem?

There is something to Tertullian’s statement that echoes in my mind as I face this coming transition. No, I’m not leaving academe to join the Church — my motivations are entirely secular. What I find striking is the stark dichotomy between the world of “traditional” education and that of the Christian. As Tertullian quipped that Greek learning was incompatible with Christian living, so I often find, in the culture around me as well as in my own thoughts, the idea that the work I have done in graduate school is incompatible with life outside of academe. Whether through legislative attacks on humanities departments as “useless,” non-academic employers who look for tangible experience over transferable skills, or my own concerns over whether I am somehow abandoning my “true” identity by exploring new career paths, I often find myself feeling as if there is a sharp distinction between my life as an academic and whatever is coming next. To put Tertullian’s maxim differently, what does a Ph.D. in Late Antique History have to do with the “real world”? (By the way, graduate work is indeed a legitimate job that requires substantial labor — hence the scare quotes around “real world.”)

Of course, not all early Christians shared Tertullian’s viewpoint. For centuries after he died, ancient schools throughout the Roman world taught both Christian students and students who followed traditional forms of worship (i.e., “pagans”). After Christianity became tolerated, and then favored, under the emperor Constantine in the early fourth century, the number of Christians increased among the highest social ranks of the Roman world. Most of these Christian elites received an education that would have looked similar to that of their grandfathers and great-grandfathers, based on the works of Homer in the Greek-speaking East, and on those of Vergil in the Latin-speaking West.

Yet things were changing in the fourth century — with the growth of Christianity came new culture wars, as Christians and non-Christians alike disputed the purpose of traditional education and its role in shaping the young minds of the Roman world. These culture wars even reached the imperial throne in 362, when the emperor Julian — Constantine’s nephew — insisted that teachers must worship the gods they taught about. Under Julian, Christians were (in theory, if not in practice) barred from teaching, since they did not acknowledge the supremacy of the gods of Homer. This policy did not last long, yet it kindled in the minds of educated Christians the same question Tertullian had asked over 150 years prior: what, indeed, did Athens have to do with Jerusalem?

In the major Syrian metropolis of Antioch, a young priest named John used this conceptual divide between classical education and Christian spirituality to speak for his own view of Christianity. John’s nickname, Chrysostom (“golden-mouth”), attests to his reputation for powerful rhetoric in the late fourth century. In one oration, Chrysostom sought to convince wealthy parents not to oppose their children’s decisions to abandon their education in order to seek ascetic life in the hinterland of Antioch. To do so, he lamented the woes of the student in language that graduate students today may relate to:

The child’s lack of ability, the ignorance of teachers, the negligence of pedagogues, the father’s want of leisure, the inability to pay fees and salary, the difference of characters, the wickedness, envy and ill will of his fellow students, and many other things will deter him from his goal. And this is not all, but even after reaching the goal, there will be further obstacles. For when he has overcome everything and reached the pinnacle of education, if he has not been prevented by any of these obstacles, other traps still lie in wait for him. The ill will of rulers, the jealousy of fellow workers, the difficulty of the times, the lack of friends, and poverty frequently frustrate his ultimate success.

I have been incredibly fortunate in my graduate career not to have encountered many of the school obstacles Chrysostom described (my faculty mentors are anything but ignorant and negligent). Yet it’s hard for me not to find an echo of the current Ph.D. crisis in phrases like “the ill will of rulers, the difficulty of the times, the lack of friends, and poverty.” According to Chrysostom, the traditional world of Roman education involved constant risk, with very slim chances of success.

In addition to offering a chilling parallel to the life of a present-day humanities graduate student, Chrysostom’s warning about the dangers of schooling in the fourth century reflects the same sharp divide between traditional education and Christian spirituality that both Tertullian and Julian had envisioned. Similar rhetoric can be found in other contemporary Christian authors. In the 370s Basil, the bishop of Caesarea in Cappadocia (in modern Turkey), looked back on his education in Athens as worldly vanity. In 384 Jerome (who later became famous for translating the Latin Vulgate) wrote that he had been “a Ciceronian, not a Christian” when he was living in the desert but still missed the refined literature of his earlier school days. Around 397 Augustine of Hippo, recalling his conversion a decade earlier, described how it had been prompted by the example of the Egyptian monk Antony, who (allegedly) gained spiritual wisdom in spite of being illiterate. Saints’ lives from this period are filled with examples of monks and bishops who outsmart “traditional” intellectuals like philosophers and orators, thus arguing for the superiority of the Word of God (Christ) over the words of “the world.”

Ironically, though, all of this literature critiquing the intellectual life of ancient Rome came from authors who had reaped education’s benefits. Basil, Chrysostom, Augustine, and Jerome had all received advanced rhetorical training in the classroom. Basil and Augustine had even spent some time as teachers themselves — perhaps they could be considered Ph.D.s who spent a year or two as lecturers before deciding to leave the academy. The point is, even after these authors “left” their education, it continued to shape the way they thought and wrote. Scholarly studies over the past few decades have shown the ways that Christian authors who declared their departure from “vain” and “worldly” schooling (late antique quit-lit?) continued to draw on the texts and intellectual methods of their student days. Even Jerome, who claimed to have had a vision in which Christ whipped him for clinging to Latin literature, continued to draw on the intellectual treasures of “the world” in his translation and exegesis of the Scriptures.

This is where I find a useful lesson for my own career transition. Whatever I will be doing professionally in the coming weeks, months, and years will very likely not have much to do with Late Antiquity. I’m still interested in the field, and I would like to keep up with ancient history in some way, whether that involves reading and blogging, attending the occasional conference, or even working on minor publications. But even without those things, my studies will continue to form a part of me. Asking, à la Tertullian, what a Ph.D. in Late Antique History has to do with the “real world” fundamentally misses the point. The Ph.D. has changed me, and I would like to think most of these changes have been for the better. Those “transferable skills” that people always talk about in defending the Humanities — critical thinking, clear communication, intellectual curiosity — really do exist. What’s more, education’s formative effects reach well beyond the classroom. Both in antiquity and today, education is far more than transferring skills and knowledge from one brain into another. From kindergarten to graduate school, students absorb physical, social, and cultural values, both consciously and unconsciously. Even if I wanted to, I could not simply detach my life studying Late Antiquity from the complex web of memories, experiences, emotions, habits, and sensations that make up what might (with proper scare quotes) be considered my “self.”

Athens and Jerusalem only represented two diametrically opposed worlds in rhetoric, not reality. It’s good to remember that the same holds true for the phrases “Humanities Ph.D.” and “Real World.”