end the university as we know it


Irvine511

thought this was a fascinating op-ed.


End the University as We Know It
By MARK C. TAYLOR

GRADUATE education is the Detroit of higher learning. Most graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist) and develop skills for which there is diminishing demand (research in subfields within subfields and publication in journals read by no one other than a few like-minded colleagues), all at a rapidly rising cost (sometimes well over $100,000 in student loans).

Widespread hiring freezes and layoffs have brought these problems into sharp relief now. But our graduate system has been in crisis for decades, and the seeds of this crisis go as far back as the formation of modern universities. Kant, in his 1798 work “The Conflict of the Faculties,” wrote that universities should “handle the entire content of learning by mass production, so to speak, by a division of labor, so that for every branch of the sciences there would be a public teacher or professor appointed as its trustee.”

Unfortunately this mass-production university model has led to separation where there ought to be collaboration and to ever-increasing specialization. In my own religion department, for example, we have 10 faculty members, working in eight subfields, with little overlap. And as departments fragment, research and publication become more and more about less and less. Each academic becomes the trustee not of a branch of the sciences, but of limited knowledge that all too often is irrelevant for genuinely important problems. A colleague recently boasted to me that his best student was doing his dissertation on how the medieval theologian Duns Scotus used citations.

The emphasis on narrow scholarship also encourages an educational system that has become a process of cloning. Faculty members cultivate those students whose futures they envision as identical to their own pasts, even though their tenures will stand in the way of these students having futures as full professors.

The dirty secret of higher education is that without underpaid graduate students to help in laboratories and with teaching, universities couldn’t conduct research or even instruct their growing undergraduate populations. That’s one of the main reasons we still encourage people to enroll in doctoral programs. It is simply cheaper to provide graduate students with modest stipends and adjuncts with as little as $5,000 a course — with no benefits — than it is to hire full-time professors.

In other words, young people enroll in graduate programs, work hard for subsistence pay and assume huge debt burdens, all because of the illusory promise of faculty appointments. But their economical presence, coupled with the intransigence of tenure, ensures that there will always be too many candidates for too few openings.

The other obstacle to change is that colleges and universities are self-regulating or, in academic parlance, governed by peer review. While trustees and administrations theoretically have some oversight responsibility, in practice, departments operate independently. To complicate matters further, once a faculty member has been granted tenure he is functionally autonomous. Many academics who cry out for the regulation of financial markets vehemently oppose it in their own departments.

If American higher education is to thrive in the 21st century, colleges and universities, like Wall Street and Detroit, must be rigorously regulated and completely restructured. The long process to make higher learning more agile, adaptive and imaginative can begin with six major steps:

1. Restructure the curriculum, beginning with graduate programs and proceeding as quickly as possible to undergraduate programs. The division-of-labor model of separate departments is obsolete and must be replaced with a curriculum structured like a web or complex adaptive network. Responsible teaching and scholarship must become cross-disciplinary and cross-cultural.

Just a few weeks ago, I attended a meeting of political scientists who had gathered to discuss why international relations theory had never considered the role of religion in society. Given the state of the world today, this is a significant oversight. There can be no adequate understanding of the most important issues we face when disciplines are cloistered from one another and operate on their own premises.

It would be far more effective to bring together people working on questions of religion, politics, history, economics, anthropology, sociology, literature, art and philosophy to engage in comparative analysis of common problems. As the curriculum is restructured, fields of inquiry and methods of investigation will be transformed.

2. Abolish permanent departments, even for undergraduate education, and create problem-focused programs. These constantly evolving programs would have sunset clauses, and every seven years each one should be evaluated and either abolished, continued or significantly changed. It is possible to imagine a broad range of topics around which such zones of inquiry could be organized: Mind, Body, Law, Information, Networks, Language, Space, Time, Media, Money, Life and Water.

Consider, for example, a Water program. In the coming decades, water will become a more pressing problem than oil, and the quantity, quality and distribution of water will pose significant scientific, technological and ecological difficulties as well as serious political and economic challenges. These vexing practical problems cannot be adequately addressed without also considering important philosophical, religious and ethical issues. After all, beliefs shape practices as much as practices shape beliefs.

A Water program would bring together people in the humanities, arts, social and natural sciences with representatives from professional schools like medicine, law, business, engineering, social work, theology and architecture. Through the intersection of multiple perspectives and approaches, new theoretical insights will develop and unexpected practical solutions will emerge.

3. Increase collaboration among institutions. All institutions do not need to do all things and technology makes it possible for schools to form partnerships to share students and faculty. Institutions will be able to expand while contracting. Let one college have a strong department in French, for example, and the other a strong department in German; through teleconferencing and the Internet both subjects can be taught at both places with half the staff. With these tools, I have already team-taught semester-long seminars in real time at the Universities of Helsinki and Melbourne.

4. Transform the traditional dissertation. In the arts and humanities, where looming cutbacks will be most devastating, there is no longer a market for books modeled on the medieval dissertation, with more footnotes than text. As financial pressures on university presses continue to mount, publication of dissertations, and with it scholarly certification, is almost impossible. (The average university press print run of a dissertation that has been converted into a book is less than 500, and sales are usually considerably lower.) For many years, I have taught undergraduate courses in which students do not write traditional papers but develop analytic treatments in formats from hypertext and Web sites to films and video games. Graduate students should likewise be encouraged to produce “theses” in alternative formats.

5. Expand the range of professional options for graduate students. Most graduate students will never hold the kind of job for which they are being trained. It is, therefore, necessary to help them prepare for work in fields other than higher education. The exposure to new approaches and different cultures and the consideration of real-life issues will prepare students for jobs at businesses and nonprofit organizations. Moreover, the knowledge and skills they will cultivate in the new universities will enable them to adapt to a constantly changing world.

6. Impose mandatory retirement and abolish tenure. Initially intended to protect academic freedom, tenure has resulted in institutions with little turnover and professors impervious to change. After all, once tenure has been granted, there is no leverage to encourage a professor to continue to develop professionally or to require him or her to assume responsibilities like administration and student advising. Tenure should be replaced with seven-year contracts, which, like the programs in which faculty teach, can be terminated or renewed. This policy would enable colleges and universities to reward researchers, scholars and teachers who continue to evolve and remain productive while also making room for young people with new ideas and skills.

For many years, I have told students, “Do not do what I do; rather, take whatever I have to offer and do with it what I could never imagine doing and then come back and tell me about it.” My hope is that colleges and universities will be shaken out of their complacency and will open academia to a future we cannot conceive.



there's been some fascinating discussion on this on various internet sites i frequent. it seems a bit of an "if i ruled the world" kind of thought piece, and it's filled with Taylor-esque high drama and sense of apocalypse, and i doubt the practicality of many of the suggestions, but are these good suggestions?
 
The dirty secret of higher education is that without underpaid graduate students to help in laboratories and with teaching, universities couldn’t conduct research or even instruct their growing undergraduate populations.

I thought the dirty little secret was that it's really just a factory to force liberal thought and create group thinking liberal robots...:shrug:
 
How funny. I had no idea that this editorial existed in the slightest, but I was thinking about this exact subject this morning (and recurring over the past month or two, as well). It's a topic I could probably talk about ad infinitum, but I think that this pervasive institutional stagnancy exists in far more places than in just universities.
 
Glad I didn't go on to grad school!

I think for a few it is a must. For example my good friend and my little sister are/want to be social workers. You can't actually BE a social worker with a BA in sociology. My good friend got a decent job as soon as she finished her MSW. My sister just changed majors and enrolled in a 5 year program (total) that includes the MSW.

For others who need a PhD, just get right into a PhD program. My boss' son wants to be a college professor of physics and he has already been accepted and given fellowship and stipend offers for both physics and computer science at various universities.

For some I do agree it's a waste of time or even detrimental. I've heard that, unless one wants to eventually teach at the college level, getting a master's in education right away means no one wants to hire you: you're worth too much, and someone who just got their teaching degree (and maybe doesn't even have a certificate yet) will do that job for way less money.

It doesn't matter to me personally, because in the field I'm in right now you make more based on technical training and certifications, not stuff that is really part of a master's program. For example, to get my current level I had to pass the CompTIA A+ (both parts) and the HDI SCA-level. To move up again I have to add responsibilities at work (having nothing to do with my education) and pass the MCDST (a technical test) and the next level of HDI (a customer support and best practices test). When I pass certain tests that prove my proficiency in more areas of expertise, I get my raise (right now I'm not eligible because there are time requirements). Luckily my current employer is all about staff development, so they pay for my training and the tests. :D
 
Most graduate programs in American universities produce a product for which there is no market (candidates for teaching positions that do not exist) and develop skills for which there is diminishing demand (research in subfields within subfields and publication in journals read by no one other than a few like-minded colleagues), all at a rapidly rising cost (sometimes well over $100,000 in student loans).

In the cases where these programs exist, wouldn't they naturally weed themselves out? They'll have no choice pretty soon.

Pressure from the struggling economy means undergrad enrollment may go down significantly, and lending institutions (including parents :) ) and students will be much more focused on employability and return on investment.

Taylor makes excellent points on inter-university/departmental collaboration and practical, problem-solving learning experiences.

Also, with technology, distance learning continues to grow, grow, grow.
 
The most problematic reality, however, is that, despite the increasing irrelevance of the university, not having a degree basically means that you're unemployable, regardless of how knowledgeable you are. For that reason alone, people will continue to feed the machine.
 
Absolutely...that may be the reality in a prolonged recession.

If you've got unemployed parents and banks are no longer lending for degrees (perceived to be) on the road to nowhere, choices will be limited.

Community colleges will get a boost (especially if they collaborate with universities) and maybe there will end up being more pressure on public high school systems to step up.
 
Ha, yeah, Obama's national mandatory service program for everyone 18-25 will fix the employability problem. Forgot about that - is it still on the table?
 
Yes, enrollments generally go up when the economy goes down, for better and for worse.



What were some of the other websites where this op-ed was being discussed? I did read it in the Times yesterday, and basically my reaction to it was that it's a totally schizoid piece, one which starts out promisingly enough with a no-nonsense overview of some of the most common problems--too many PhDs for too few jobs, especially in the humanities; spiraling tuition costs; excessive specialization and resulting insularity, particularly at elite research universities--but then it spins off into the ether with these visions of temporary designer 'majors' (who exactly does he think would plan and coordinate such extreme curricular overhauls? what are these students going to wind up with 'degrees' in, and how are we going to get a next generation of academics out of that?); teleconferencing as a one-size-fits-all solution to shrinking major offerings (do you really want to get your degree in Chinese exclusively through teleconferencing?); and the supposed potential savings of nonconventional dissertation formats (those already exist, which is great, but they don't generally save time, money or resources). It simply isn't necessary or desirable to go to these kinds of castle-in-the-clouds lengths to cultivate more interdisciplinary collaboration and to cost-effectively enhance the depth of academic resources available to your students.

There are also a couple of characterizations he makes that might apply well to larger private elite universities such as Columbia (where he teaches), but really don't apply much elsewhere, in my experience. Professors as advisors don't encourage students to enroll in grad programs so that "we" can benefit from their cheap labor; they're generally not going to enroll at the same school they attended for undergrad anyway, so what would be the point? As for the "functionally autonomous" professor, that too is far more characteristic of elite research universities than it is of non-elite big state schools, let alone branch campuses of state universities; there simply aren't enough tenured and tenure-track faculty (nor enough money) at such schools to allow individual professors to duck the service responsibilities (advising, steering committees, search committees, curriculum planning, chairing etc. etc.) that keep you on the same page with your colleagues about where your program is headed and what its goals are.

And if you were going to abolish tenure, what would be the point in doing that by extending its standard 7-year-milestone contract indefinitely? Why not just treat it like most other professions, where once you've made the initial clearance, you can be easily fired for inadequate performance at any time? (Not that I advocate this--particularly given universities' track records of eating their own by delegating an ever-expanding share of the teaching to lecturers and adjuncts making <$5K per course; what makes him think this wouldn't accelerate that trend even more?--but, if you're going to do it...)

Frankly, it's also rather ironic that someone whose scholarly output is as abstract and theoretical as Taylor's should be holding forth on the need to restructure all academic programs to be strictly defined by "real-life issues" and "practical solutions."
 
My personal view on what the university should become, on an academic level, is to have a core curriculum that is well-rounded and encourages critical thinking and a spirit of self-learning that goes beyond graduation. Right now, I can think of very few people who think highly of what they learned in their core curriculum before taking major-related coursework. It should stop being a place for watered down classes that often become little more than politically correct gimmicks for mediocre professors with a political ax to grind.

And, frankly, the major coursework should generally get more down to business, with fewer useless detour courses and more hands-on opportunities, including a stronger emphasis on internships and how to navigate the creative cesspool that is the corporate world. If the task of the university is to create the leaders and innovators of tomorrow, I'm quite concerned about our future!
 
What kind of core program would you have liked to have had as an undergrad? A traditional 'Great Books'-oriented core; a selection of courses emphasizing interdisciplinary approaches to subjects traditionally considered the domain of one or two fields; a limited set of introductory courses in designated foundational disciplines (science, math, art, philosophy etc.) with a strict emphasis on acquainting all students with basic skills essential to those disciplines; or something else?

There are many reasons why the various types of 'core' programs out there are so often unsatisfying for students. Two which immediately come to mind as relevant to Taylor's piece are 1) shrinking tenured-faculty tiers mean fewer qualified and dedicated people available to collaborate on planning and monitoring the progress of quality core programs; and 2) the increased emphasis on research and publishing, in tandem with (1), means an increased incidence of Who-Gives-A-Shit attitudes from faculty about students outside their own majors and their own specializations--and from department chairs who can barely manage to keep their major-core sections covered, so the last thing they want is to 'lose' their best teachers to universitywide projects.

More internships for undergraduates is a great idea, and something that certain departments at many colleges are already doing. Again, though, you need enough funding to have dedicated staff for overseeing this process--there's a lot of work involved at the department's end in arranging and monitoring internships and preparing students for them.




I do think that, too often, people blame the departments they majored in for failures to 'save' them from confused or inadequate personal future planning, when said problems were in fact of the type best addressed by at least starting with the career counselors over at campus counseling services. The type of advising individual professors are qualified to do really doesn't extend to the needs of students who plain old don't know what they want to do with their lives, period.
 
^ Thanks! The comments on those are a lot more thoughtful than the largely trollish replies to the Chronicle of Higher Education's lackluster column on this, e.g.:

...Since the mandarins know the diminutive grad-docs have sold themselves and are “on the hook” invested, they oftentimes feel free to insult them—and even behind their backs. But the grad-docs develop a certain precious asset or condition that warms and brings consolation to their wretched garrets: compensatory hypertrophy of the sense of self-worth, for after all, haven’t they trannied themselves up as bitchin’ epigones of big-cutting-edge-lit-critter with the appropriate spew of “insider” argot-laden theorrhoea? In lit and comp-lit departments a few lucky specialists in post-human transgendered comic books acquire full TT teats, a few get eye-dropper portions as adjuncts, but many of the culls in traditional scholarship areas get a cheap suit, ten bucks, and the bum’s rush. Yes, welcome to the grad school demimonde, dedicated to mandarin pursuit of FTE, conferences far enough away to forgo appearing in classes—with well-stocked top-shelf port-a-bars and a “what-happens-at-conference-stays-at-conference,” conscience—a suitably departmental “it’d-be-a-great-place-to-be-without-all-the-students,” or All-Soul’s team spirit, and a disarming Platonic Marxist “live high, talk low” sense of civic responsibility. While some perverse few might offer the timid rejoinder that this “arrangement” is so vicious and corrupt it’s a waste of time to try to defend it in any way, others might savor mostly delusory prospects of entering the tenured academic arcanum, the Valhalla, the ne plus ultra of tenure in such an esteemed and self-preening pyramid scheme.

:blahblah:



ETA: Also, several of the (good) comments make the point that much of what Taylor says is really only applicable to the humanities and to a somewhat lesser degree the social sciences.
 
There is certainly a devaluation of a humanities degree/education (I separate, I think wisely, degree from education) in the marketplace. Somewhat fairly. At many colleges, the humanities programs are lackluster and unchallenging, and you have to seek out the professors who do challenge you.
 
The college where I work is huge on the "liberal arts core" and most of the students love it or hate it. Some say that's why they picked the school, but I think many more see it as a waste of time. Why are you taking a 100-level American Lit course when you are going into chemical engineering?

I don't really know how I feel about it since it hasn't affected me personally one way or the other. I will say that a lot of the liberal arts core I did was similar to and not any more difficult than my high school curriculum. Getting my current job had more to do with my work experience and how I "fit" on this team than my coursework or my degree.
 
For the most part, I really enjoyed the core/university requirements I had to take as an undergrad. At the time I started college, I was thinking in terms of maybe a career in nonprofit management, and I can't imagine I'd have taken courses in e.g. art history or philosophy, both of which I really loved, if I hadn't had arts and humanities requirements to fulfill. I wound up majoring in political science because I took a course in it purely to fill a social science requirement, and the professor turned out to be a specialist in political culture, which really appealed to me; here was a discipline which was 'practical' and structured, yet approaches existed within it that integrated the study of culture and society with the analysis of political institutions and theory. That would never have occurred to me otherwise.

It's true that I never wound up really 'using' much of the knowledge from some of those university requirements--biology for example, though I'd always enjoyed biology, so I didn't mind. And I did usually make a point of choosing courses for those requirements that sounded like they might give me some kind of broad introductory foundation in the discipline; I passed over stuff like 'Images of Redemption in American Popular Culture' or whatever. I still tend to think things like that are best left to upperclassmen who have enough discipline-specific analytical tools and background under their belts to make something fruitful out of it; in the hands of freshmen and sophomores, too often the results are pretty much fluff. In general, one thing you do have to be careful about (as a teacher) when doing interdisciplinary work is that you're not watering down the contributions of one or more of the disciplines involved to the point where you're really doing it a disservice. Team-teaching can often be a good solution to that, but it's a lot of extra effort to make it work.
 

i would be lost today if it weren't for the broad gen. ed. requirements i had to fulfill as an undergrad.

yolland - i wonder how the loss of tenure would affect professors at private institutions with little or no collective bargaining rights. it seems to me that at-will employment would be disastrous for the learning environment. would it not place a dubious requirement on the faculty to fall in line with the administration in both policy and politics? i think we're all aware that, in practical terms, universities are turning into corporations. how does one speak truth to power under those circumstances without some sort of protection for intellectual freedom?
 
Tenure is already well on the road to being lost, and not just at private universities; less than half of all college/university faculty are in tenured or tenure-track positions now, and part-time adjuncts make up the largest faculty employment category. Tenure rates are overall higher at public institutions, but not by much--generally, the most noticeable difference when comparing stats on the two within a state is that public institutions' tenure rates won't vary from each other by more than about 15%, whereas with private institutions it's all over the map: a minority won't have tenure at all, the largest chunk are usually in the public institutions' range, then another minority will have higher tenure rates than the public schools. The academic unions have been fairly effective at securing modest improvements in raises and benefits (more so for tenured than non-tenured faculty); they haven't been effective at all in preventing the erosion of tenure.

The reason for the increasing reliance on adjuncts is of course to save money, but it also happens to be a great way to divide and demoralize the faculty. You wind up with adjuncts and lecturers who are increasingly bitter and resentful over their grotesquely low wages, poor-to-nonexistent benefits--what good's a PhD when you can make more managing a Taco Bell?--inadequate office space, and lack of time and money to conduct research or attend conferences...and most of their anger winds up directed at the tenured faculty, whom they know mostly as either their immediate bosses (department chairs, writing program directors etc.), or as virtual strangers who presumably spend most of their time snoozing in their nice private offices in between their enviably lesser teaching commitments.

Meanwhile, tenured and tenure-track faculty are increasingly overburdened and isolated within the service commitments that adjuncts don't have--search, steering, curriculum and tenure committees, individual and group student advising--and they in turn know the adjuncts mostly as angry, alienated people with little sense of stake in the department's longterm goals who appear to think the tenured profs inherited their positions from their rich uncle or something. That's the worst-case scenario, anyway...and if it doesn't apply, it's because the faculty have taken concerted steps to avoid it; I've never seen a department where these basic tensions aren't there.

And when faculty do unionize (I've never personally taught at a unionized school), unfortunately this often takes the form of separate unions for the tenured and tenure-track faculty on the one hand, and non-tenure-track faculty on the other, because their priorities tend to be different: tenured faculty want to focus on opening up more tenured positions; adjuncts want to focus on better wages and benefits for the positions they already have.

So, while it's unquestionably true that tenure gives you more protection to challenge the administration (as well as your own departmental colleagues), unfortunately that doesn't necessarily add up to a better deal for the adjuncts, not least because of disagreements over the best strategy for securing a better deal. But yes, if tenure didn't exist at all, then you'd have a situation where administrators have all the power to chart the course of departments, decide who gets grants and for what, define what constitutes cause for firing and so on. Public university faculty would broadly speaking be better off in the sense of having greater freedom to unionize in response, but unions aren't really going to be able to substantively address concerns like the aforementioned (maybe firings, to a point), and state governments certainly aren't going to intervene in matters like those either. It's not just a question of job security and wages (though those are certainly critical aspects of it) but also of the rights of the people who actually do the teaching and research--which is a public service, not a "product"--to have a say in shaping their departments' and their universities' longterm goals.
 
I feel pretty conflicted after reading this article as well. I'll throw in a bit of my personal experiences in the past two years of my undergrad education...

I am planning on going into academia, for no reason other than that I love to research, write, and discuss ideas with people. I originally came to my university thinking that I wanted to be a doctor and figured out quickly that I was not a science-oriented person. Then I defaulted to political science, which was my first love back in high school, but I also realized that there was no way that I wanted to go through the traditional poli sci program either.

Luckily, my first semester here, I took a freshman seminar (classes with 20 people max, taught by full professors, with the intent of building relationships with faculty) on popular music, which I absolutely loved. And it turns out that I was good at writing about music, so it combined my love of music with my love of research/writing. The professor who taught me took me under his wing and helped me navigate my academic adventures. He's the one who brought up the possibility of academia to me, which I wrote off for a while, but eventually came around to. I thought about trying to get into the music program here, but I'm not classically trained and wouldn't pass an audition, so that wasn't an option either. Because of this, my professor is currently working on forming a BA in music that wouldn't require an audition, something that would completely transform the School of Music, and I'm incredibly proud to have been the catalyst for that. Anyway, he also told me about the individualized major at our school, which would allow me to combine courses from several departments, all contributing to my theme, popular music studies.

Now, of course, this is a highly specialized field, and one that I realize I basically can't get a job in outside of academia. And yet, because it's interdisciplinary, I feel like I've picked up an amazing amount of knowledge that I HAVE applied outside of classes, as well as within them.

As for the comment in the article about professors advising students with the intent of making carbon copies of themselves, I suppose that's somewhat true in my case. But while my professor pointed me in the right direction and has continued to help me learn how I work and how I write (we've figured out that we think and work in similar ways), he pushes me to go beyond what he's done, which is mostly musicology. He facilitates my learning immensely; for example, I'm doing an independent study project with him this semester and taking a doctoral seminar with him as a junior next fall because he wants to intellectually challenge me. He wants me to take what he can teach me and put it to use in cultural studies and American studies, where popular music scholars usually can't talk about the music itself, only its cultural impact. So, in a way, yes, I'm being groomed for academia by him, but at the same time, I'm being challenged to take it further than it has been in the past.

So, for me, it will be interesting to see if interdisciplinary programs take hold at more universities or not, and if so, what form they take and how they will be implemented.
 
From where I am sitting, part of the problem with the way undergraduate programs are structured has to do with how the North American system handles professional programs.

In Europe, programs like medicine, law, dentistry, etc. are not second-entry. But here, they are. So you effectively end up with a good chunk of the student body (which, sadly, comprises most of the very top students) that is simply biding its time. And while some of those people do enjoy their undergrad years, a lot of them really just don't care. When you tack on interdisciplinary or mandatory general breadth courses, you can imagine how little they care about those, except insofar as they need good grades to apply for their second degrees.

It doesn't really inspire you as a student is basically what I'm saying. And to be honest, do I feel that my mandatory courses substantially contributed to my overall education or educational experience? Not really. Some of them were interesting, but there was certainly nothing about them that was instrumental or necessary...
 
I much prefer the US system of being able to do a few liberal courses before settling on a major.

The European system typically forces students to choose a specialism at much too young an age. In Ireland, while the school leaving certificate examination is reasonably diversified, at 17 or 18 one is forced to choose a course which will effectively direct the trajectory of one's entire career. In the UK, it is even worse, with students at the age of 15 or 16 choosing typically only three or four A-levels, of which it is entirely possible - and indeed frequently the case - that none are science subjects or, conversely, that none are liberal arts subjects at all. Ridiculous.

To me, the questions posed by CP Snow in his landmark 'The Two Cultures' essay (The Two Cultures - Wikipedia, the free encyclopedia) remain completely unresolved.

Which brings me back to my grand theory that overspecialisation is the bane of our age.
 
I much prefer the US system of being able to do a few liberal courses before settling on a major.

The European system typically forces students to choose a specialism at much too young an age.

I agree with you there.

But you have to keep in mind that our degrees here cost an arm and a leg, which is not typically the case in Europe. So for the benefit of having a liberal arts degree before going to law school, I know people who rack up $50-100K of debt, and that is a deep, deep hole to dig yourself out of, especially when you're adding even more debt on top of it.
 