I sheepishly admit that I am slow to explore and adopt new technologies. I am not a Luddite, per se, and I enjoy learning and implementing new things. But, being an Aspie, I get comfortable with whatever platform or technology I am using. I like routines and knowing what to expect. Yet, I am capable of surfing the waves of change; I just generally don't dive into them before they break. Learning to surf the waves of change in software and hardware was a necessity to survive the transitions during my academic career, starting from keypunch cards as an undergraduate (I was in the last class to use slide rules, and in the last class to use keypunch cards), to a mainframe computer with terminals visited by its CPU every few microseconds, then the introduction of the first Apple PC (OMG--one could enter data right into a computer!) and PCs. I lived through bets institutions took on whether MS-DOS, DEC, or Apple would win the operating system war (DEC lost). I also debated with colleagues, for what seems to have been years, whether Word would knock out WordPerfect, and whether Lotus 1-2-3 would wipe out Excel. Much of this occurred during my doctoral student years from 1983-1987.

My PhD dissertation should be a museum piece. One chapter was written on a mainframe, one on an Apple Macintosh, a couple on an MS-DOS IBM PC, and one on a DEC machine. Graphics were done on a mainframe (meaning you couldn't see a printout for hours after you submitted the code, and then had to make changes and submit again--long, long nights in Yale's computing center). I also used a wonderful graphics program, Cricket Graph. All of my statistics were done on mainframe versions of SAS (again waiting hours, sometimes only to find out you forgot a character in one of the lines of code). I managed some data in Excel and other data in Lotus. There are 700 references at the end of my dissertation, all done without EndNote.
My dissertation should be a museum piece because the software and hardware involved in its production form a quilt of IT patchwork documenting the war between personal computers and mainframes, and the battles for operating system domination. Yet, despite that ancient patchwork, you can't actually tell that it is all that different from dissertations today, other than that the graphics were simpler and less colorful. BTW, MS-DOS really sucked.

This was followed by a couple of postdocs. In my second postdoc at Harvard, we managed a lot of plant ecophysiology and ecology data in Excel. The files often were too large to store. I remember the celebration we had when we purchased our first 30MB external hard drive--a dozen graduate students and postdocs dancing in glee as we connected our new (several-hundred-dollar) drive to our shared Mac computers. The smile that memory brings is short-lived, as it dates me to ancient history from the perspective of junior colleagues and undergraduates.

The only time I have ever been on the leading edge of technology is the World Wide Web. On my mainframe terminal at Syracuse University, there was something called "the Experimental World Wide Web"--I had no idea what that was and couldn't access it. Eventually, though, our terminals were connected to the internet, web pages started to appear, and I finally figured out what the WWW was. I was on our department's graduate recruitment committee with some energetic faculty (including Dave Allis, who recently passed away and was honored for revolutionizing the chromatin and gene-expression field). I saw that the web could be valuable in recruiting students--so did Dave (other faculty were still not sure about the web).
So, I took it upon myself to learn HTML (it was really easy to learn then because you could see the code behind any published website) and created the web page for the Department of Biology aimed at graduate students. Candidly, it did not look all that different from most department websites today. It was one time where I was a bit out in front of colleagues in using IT tools. But my days of HTML web programming ended shortly after that.

Lately, I've been reading about AI in academe and the sense of panic that is building. I am also aware of the societal concern that AI may actually become HAL (the computer in 2001: A Space Odyssey), but I was too busy to pay that much attention. I am so behind that I had no idea what ChatGPT was until today. OK, I agree: Luddite does come to mind as an adjective. But at least once I engaged with ChatGPT, I understood its value (and danger). So, I had a nice conversation yesterday with ChatGPT about the vocabulary and concepts of evolution, and about student success software (don't ask ChatGPT for a recommendation--but it gives you very good decision tools). I now understand what the fuss is all about. I may ask ChatGPT to write my blog in the future. It will be more concise, funnier, and pack a stronger punch line.

I think ChatGPT will become friends with neurodiverse people across the globe, leading to sales growth of the t-shirt pictured with this blog. ChatGPT is extraordinarily genuine, always honest, and focuses on answering your questions. No social cues have to be read. And ChatGPT harbors no behavioral expectations of a user other than being able to type a coherent question; it has no emotions to read, doesn't cause performance anxiety, and connects equally to neurodiverse and neurotypical people. I don't think it will replace my dog, since it can't yet wag its tail wildly, jump onto your lap, and throw a celebration like my dog does.
But it does have the advantage of answering questions--and it never tires of questions or persistent conversations, as my human family does. Although potentially a good friend, ChatGPT might just be quietly leading me towards a superfluous life. I feel like I could have just asked ChatGPT what the causes and consequences are of variation in heat shock proteins in plants back in the 1990s and saved myself thousands of hours, saved the federal government a million dollars or so, and spared myself all the angst of declined grants, rejected papers, and going through the tenure process. I mean, ChatGPT might have answered that scientific problem in a matter of seconds at the cost of a few milliwatts of electricity. The only thing I have over ChatGPT right now in the classroom is that I care about students personally, not just their questions. But how much longer will I have that advantage?

Yesterday I was worrying about the situation at UNCG, which got me thinking a lot about "death spirals" in academe. So, I asked ChatGPT what it thought. It responded with its usual lack of verbosity, and its usual conciseness and clarity. For my colleagues at UNCG, does anything seem familiar?

The term "death spiral" in academia is used to describe a situation where an institution or academic program experiences a continuous decline in key metrics, leading to a downward cycle that becomes increasingly difficult to reverse. This negative feedback loop can have severe consequences for the institution's viability and long-term sustainability. A death spiral in academics typically involves the following elements:
Breaking out of a death spiral often requires strategic interventions, such as implementing innovative recruitment and retention strategies, strengthening academic programs, improving financial management, rebuilding institutional trust, and fostering a culture of collaboration and resilience. However, reversing a death spiral can be a complex and arduous process that requires significant efforts and resources.

ChatGPT also asked if I wanted it to "Regenerate response"--but this answer said it all. No revision needed. Grade: A.