Population Statistic: Read. React. Repeat.
Monday, March 05, 2007

While genetic evidence that the residents of the British Isles are less distinct from one another than has been assumed historically is certainly interesting, it’s not interesting enough for me to cogitate about. The cultural entrenchment in identifying yourself as English, Scottish, Welsh or Irish precludes any gene-splicing.

The angle in this story that I do find interesting: That same DNA research supports arguments that the English language itself, long assumed to be a polyglotted offshoot of parent Germanic tongues, is in fact a distinct parent branch in its own right.

English is usually assumed to have developed in England, from the language of the Angles and Saxons, about 1,500 years ago. But Dr. Forster argues that the Angles and the Saxons were both really Viking peoples who began raiding Britain ahead of the accepted historical schedule. They did not bring their language to England because English, in his view, was already spoken there, probably introduced before the arrival of the Romans by tribes such as the Belgae, whom Julius Caesar describes as being present on both sides of the Channel…

Germanic is usually assumed to have split into three branches: West Germanic, which includes German and Dutch; East Germanic, the language of the Goths and Vandals; and North Germanic, consisting of the Scandinavian languages. Dr. Forster’s analysis shows English is not an off-shoot of West Germanic, as usually assumed, but is a branch independent of the other three, which also implies a greater antiquity. Germanic split into its four branches some 2,000 to 6,000 years ago, Dr. Forster estimates.

Historians have usually assumed that Celtic was spoken throughout Britain when the Romans arrived. But Dr. Oppenheimer argues that the absence of Celtic place names in England — words for places are particularly durable — makes this unlikely.

To my ear — coming from someone who learned English and Greek simultaneously while growing up and later dabbled in Spanish, German, and Russian — the emphasis on English’s admixtures of extra-British lexicons always came off as overdone. Because it’s unquestionably a Germanic tongue at root and in structure, even when taking into account contributions from French, Danish (fun fact: “are”, the plural form of the verb “to be”, is Danish in origin — attesting to how deeply ingrained Viking invasions became) and other languages. That’s not news, of course.

It’s assumed in this article, but bears noting: This native English from a couple thousand years ago would sound markedly different from what’s spoken now. Linguistic evolution inevitably transforms languages. The issue is when and how that proto-English came into being.

All in all, an intriguing scenario.

by Costa Tsiokos, Mon 03/05/2007 09:05:42 PM
Category: History, Science, Wordsmithing
| Permalink | Trackback | Feedback

Wednesday, February 28, 2007

Not to harsh anybody’s buzz, but this excerpt from Chalmers Johnson’s “Nemesis: The Last Days of the American Republic” provides food for thought:

The United States has been continuously engaged in or mobilized for war since 1941. Using statistics compiled by the Federation of American Scientists, Gore Vidal has listed 201 overseas military operations between the end of World War II and September 11, 2001, in which the United States struck the first blow. Among these, a typical example was Operation Urgent Fury in 1983, “Reagan’s attack on the island of Grenada, a month-long caper that General [Alexander M.] Haig disloyally said could have been handled more efficiently by the Provincetown police department.” Excluding minor military operations, Drexel University historian and political scientist Michael Sullivan counts only “invasions, interventions, and regime changes since World War II” and comes up with thirty bloody, often clandestine, American wars from Greece (1947-49) to Yugoslavia (1995 and 1999). Neither of these compilations included the wars in Afghanistan (2001-) and Iraq (2003-).

And from this, you can discern the underpinnings of the military-industrial complex, as well as the general prerogatives of being the global hegemon/superpower.

by Costa Tsiokos, Wed 02/28/2007 10:58:52 PM
Category: History, Political
| Permalink | Trackback | Feedback

Tuesday, February 27, 2007

Current-day discoveries of unexpectedly advanced scientific/technological techniques in ancient societies tend to center on the Greco-Roman world, to the point where you just assume that all the genius activity back then took place in Latin and Greek.

But there was plenty of genius juice to go around. One place where it flowed was the medieval Middle East. A research study has found that Islamic architecture from that time, with its intricate geometric tile patterns, displays an advanced application of a form of geometry and mathematics that modern scientists figured out only thirty years ago.

Some of the most complex patterns, called “girih” in Persian, consist of sets of contiguous polygons fitted together with little distortion and no gaps. Running through each polygon (a decagon, pentagon, diamond, bowtie or hexagon) is a decorative line. Mr. Lu found that the interlocking tiles were arranged in predictable ways to create a pattern that never repeats — that is, quasi crystals.

“Again and again, girih tiles provide logical explanations for complicated designs,” Mr. Lu said in a news release from Harvard.

He and Dr. Steinhardt recognized that the artisans in the 13th century had begun creating mosaic patterns in this way. The geometric star-and-polygon girihs, as quasi crystals, can be rotated a certain number of degrees, say one-fifth of a circle, to positions from which other tiles are fitted. As such, this makes possible a pattern that is infinitely big and yet the pattern never repeats itself, unlike the tiles on the typical floor.

This was, the scientists wrote, “an important breakthrough in Islamic mathematics and design.”
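A side note on why such patterns can never repeat: a classical result (the crystallographic restriction theorem, which the article itself doesn’t mention) says a periodic tiling of the plane can only have 2-, 3-, 4- or 6-fold rotational symmetry. The fivefold rotations Lu and Steinhardt describe (one-fifth of a circle) therefore force the pattern to be aperiodic. A minimal Python sketch of that integer-trace check:

```python
import math

# Crystallographic restriction: a rotation by angle theta can map a periodic
# lattice onto itself only if the trace of its rotation matrix, 2*cos(theta),
# is an integer. That limits repeating patterns to 1-, 2-, 3-, 4- and 6-fold
# rotational symmetry.
def allows_periodic(n_fold: int) -> bool:
    """True if an n-fold rotation is compatible with a repeating (periodic) tiling."""
    trace = 2 * math.cos(2 * math.pi / n_fold)
    return abs(trace - round(trace)) < 1e-9

for n in (2, 3, 4, 5, 6, 10):
    label = "periodic OK" if allows_periodic(n) else "forces aperiodicity"
    print(f"{n}-fold: {label}")
```

Fivefold (and tenfold, the decagon’s symmetry) fail the integer test, which is exactly why a girih or Penrose pattern can extend indefinitely without ever repeating.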

It’s no secret that Muslim culture kept the light on, so to speak, during a time of general decline in Europe. I can’t place the source, but I read at some point that the early rise and expansion of Islam a millennium ago could be characterized — given the geographic/demographic context — as a final flowering of Hellenism. That’s probably too tidy an attempt to rationalize those accomplishments in relation to the religion’s modern insularity. This evidence of technical proficiency points to ample institutional knowledge under a onetime-ascendant Islamic aegis.

by Costa Tsiokos, Tue 02/27/2007 07:58:23 PM
Category: Creative, History, Science
| Permalink | Trackback | Feedback

Wednesday, November 29, 2006

More than the surprising mechanical sophistication of the Antikythera Mechanism, an ancient Greek artifact dubbed the “world’s first computer”, what strikes me is the tip-of-the-iceberg implication it represents:

[University of Munich scholar Dr. François] Charette noted that more than 1,000 years elapsed before instruments of such complexity are known to have re-emerged. A few artifacts and some Arabic texts suggest that simpler geared calendrical devices had existed, particularly in Baghdad around A.D. 900.

It seems clear, he said, that “much of the mind-boggling technological sophistication available in some parts of the Hellenistic and Greco-Roman world was simply not transmitted further.”

“The gear-wheel, in this case,” he added, “had to be reinvented.”

Which underlines the transitory nature of collective human knowledge and achievement, really. Who says that 99 percent of everything built and established by this present day can’t be wiped out readily, lost to subsequent generations? Having grown up during the darkest days of the Cold War, I recall pretty frequent threats of such a scenario.

Beyond that gloom-and-doom, the Antikythera Mechanism will find its way into the Greek-pride arsenal of a few of my relatives, who like to expound on such cultural chest-puffing. That arsenal includes, of course, ancient steam engines and automatic doors.

by Costa Tsiokos, Wed 11/29/2006 11:27:05 PM
Category: History, Science
| Permalink | Trackback | Feedback

Saturday, November 25, 2006

no longer reserved
Today is November 25th. Depending on your outlook, professional sports changed for the better, or for the worse, 37 years ago, when All-Star centerfielder Curt Flood told MLBPA executive director Marvin Miller that he would sue Major League Baseball to challenge and eliminate the reserve clause, a fight that, if won, would allow players to become free agents.

Early on, Flood’s quest was obscured by his apparently rarefied position in the sport:

Most sportswriters at the time attacked his assertion that the reserve clause made him feel like a slave. When Howard Cosell asked him how someone earning $90,000 a year, one of the top salaries in the game at the time, could feel like a slave, he responded, “A well-paid slave is nonetheless a slave.”

Which is the heart of it. Even if you’re being compensated handsomely, it doesn’t change the inequity of the structure. These days, it’s a question of principle versus reality: It’s hard to argue that a guy (in any sport) making $15 million a year is being exploited; yet often that same player could be making even more if he were truly free to market his services to the highest bidder.

That’s why the title of the new Flood biography, “A Well-Paid Slave: Curt Flood’s Fight for Free Agency in Professional Sports”, is so apt when recounting the history. In turn, it’s just as fitting when describing the current pressures in professional athletics. So we have William C. Rhoden’s “Forty Million Dollar Slaves: The Rise, Fall, and Redemption of the Black Athlete”. Again, exploitation doesn’t jibe with the price tag.

In honor of Flood’s undertaking — which fell short despite going all the way to the Supreme Court, but which soon paved the way for success — I present here my own eight-year-old essay on its impact, not only on baseball and other sports but on American society. The intellectual property rights are owned by the St. Petersburg Times; but since they’re not doing anything with it currently, I’m sure they won’t mind my reproduction here. Besides, they owe me for misspelling my name in print — while I was on their payroll, yet!


Independence Days
Copyright Times Publishing Co., August 14, 1998

Feb. 4, 1976, is not a particularly memorable date for most people. But it should be: It changed the course of American life, as we knew it, forever.

Yes, I know that sounds overblown. But it’s true. That date marked the official endorsement of free agency in major-league baseball. And, let’s face it, things never have been the same.

There was a time when being a free agent was the last thing someone with professional sports ambitions wanted. An athlete then wasn’t a free agent by choice; he was a free agent because no one wanted him. No team figured his talent was worth the investment of one Standard Player’s Contract, for the minimum one-year salary.

So when Jim “Catfish” Hunter, the first de facto free agent in major-league baseball, found out in 1974 that his services were his - and his alone - to offer to any team, his thoughts weren’t about how much money he would make or where he wanted to play.

“I said, ‘I don’t have a job. I got to find me a job,’ ” Hunter recalled years later.

Some reaction, huh? Instead of dreaming about the size of his signing bonus or stipulating a no-trade clause in his next contract, Hunter was worried about where his next paycheck would come from. It ended up coming from the Yankees, but it wasn’t all about the money. Hunter turned down an offer from San Diego that was $500,000 richer, and before that he even thought about staying with his original team, the Oakland Athletics.

Hunter’s concerns were natural for the mind-set of America in 1974. Job security was a central feature of everyday life. More than just economic security, a job meant an identity for the individual, more so than it does today. In exchange for hard work and loyalty to a company, a worker got a steady income, chances for advancement, a social network among co-workers and a retirement fund.

The alternatives? Going into business for yourself, if you could afford to, and taking the attendant risks. Or hopping from one job to another, offering services for a set amount of time before starting over with another outfit and not building a foundation for the future - in other words, being a free agent.

Contrast that with the images free agent conjures up today. When Mark McGwire or David Cone files for free agency, it’s not because he no longer is able to play or can’t find a job. It’s because he no longer is obligated to remain with a team and (depending on recent performance) stands to substantially increase his income by shopping his services.

The revolutionary alteration in a system that controlled a player’s destiny from cradle to grave for nearly a century didn’t happen overnight. It started in 1970 when Curt Flood decided he should have at least some say in where and for whom he played. He challenged the reserve clause, which gave a big-league team all rights to a player for as long as it wanted him.

As a result, Flood, one of the best defensive centerfielders of all time, basically committed career suicide. He also lost his case after taking it to the Supreme Court. (And he did it alone, it should be noted; no other player joined him.)

Flood paved the way for many who followed. Hunter wriggled free because of a technicality: An arbitrator ruled that A’s owner Charles Finley breached a part of his contract and therefore made its provisions invalid. The next year, independent arbitrator Peter Seitz ruled that players Andy Messersmith and Dave McNally were free agents after both played the 1975 season without having signed their contracts, therefore freeing them from further obligations to their teams.

On Feb. 4, 1976, federal Judge John W. Oliver upheld Seitz’s decision, ending the old system and opening the door for free agency.

The Messersmith-McNally decision dismantled the one key legal instrument by which the owners controlled player movement and, more important, player salaries. Without it, the owners had to recognize the right of players to free agency, and they conceded it with the institution of a collective bargaining agreement in 1976.

Slowly but surely, basketball, football and hockey players demanded the same right and got it. As a result, salaries have skyrocketed.

But perhaps the greatest impact of free agency has happened outside of professional sports and is being felt only now. For the generation that’s grown up with free agency, it’s a natural state. People in their 20s who are entering the work force are comfortable with trading on their skills and knowledge to advance in life, rather than committing to one organization that may or may not reciprocate that commitment.

Exaggeration? With more people leaving the traditional work force to start home-based businesses, the concept of free agency takes another step: using your abilities for yourself. A recent issue of the business magazine Fast Company proclaimed in a cover story the creation of a “Free Agent Nation,” noting that about 25-million people, or 16 percent of the work force, basically work independently of a company.

It’s been argued that baseball no longer is America’s pastime and that the game is too slow and old-fashioned to keep up with the times. But 20-plus years ago, it proved to be a trendsetter.

by Costa Tsiokos, Sat 11/25/2006 08:40:59 PM
Category: Baseball, History, Society
| Permalink | Trackback | Feedback (2)

Thursday, November 23, 2006

As you chew on that birdmeat today, you can mentally chew on how a funny-looking animal native to North America came to share a name with a Muslim country.

It’s all pretty convoluted, involving Guinea fowls, peacocks, and East Indies/West Indies trade routes. Especially enlightening are the translations of the bird’s name in other languages. Surprisingly, many European languages identify the turkey as an “India bird”.

My own ethnic perspective: In Greek, the turkey is known as γαλοπούλα (read that as “gallo-poula”). Exact translation on that is iffy, but among family members, the obvious etymology is accepted: “Gallo” is gallic/Gaul, which is the modern Greek name for French/France; “poula” is bird.

Which gives us “French bird”. Which I find most interesting, given Greek history and interaction with Turkey the country. My best guess: The bird we English-speakers identify as the turkey was introduced to Greece a few hundred years ago via French merchants/traders. So naturally, the natives assigned a tag according to the immediate source for this new species. Just a hunch, but it sounds right to me.

by Costa Tsiokos, Thu 11/23/2006 01:13:33 PM
Category: History, Wordsmithing
| Permalink | Trackback | Feedback

Saturday, November 18, 2006

I admit, I regard the Retro Kids, a squad of homies who take their fashion cues from Theo Huxtable and Bell Biv DeVoe, with mixed feelings.

On the one hand, I feel an affinity for others whose pop-cultural tastes run to oldschool, especially when it comes to rap.

On the other, there’s that undeniable acknowledgement that I’m getting older, because of the youth of these new adherents:

Since most of the ’80s-loving men were in diapers when dookie chains were all the rage, they’re giddily living out a fashion moment they mostly know from pictures. “I feel like I should have been born in the early ’70s,” said Kenneth Barclift, 20, who will begin studying fashion design at the Fashion Institute of Technology in January.

Hey Kenneth: I was born in the early ’70s. It was equal parts magic and tragic — like any other era, I suppose. I’m not sure I’d wish it upon anyone who wasn’t already born then.

How much of the Reagan-era fashion trends did I indulge in? Fortunately, most of those records are permanently sealed (thank God for no World Wide Web back then). But I’ll admit to having made a few trips to Chess King (vintage attire from which I can’t believe survived the ensuing twenty years, as it tended to disintegrate within months of purchase).

by Costa Tsiokos, Sat 11/18/2006 06:57:12 PM
Category: History, New Yorkin', Pop Culture
| Permalink | Trackback | Feedback (1)

Tuesday, November 14, 2006

The phrase “Will no one help the widow’s son?” is the S.O.S. alert for Freemasons everywhere. It’s supposed to be secret code, but obviously, the word’s gotten out.

It occurs to me that, with my father’s death last year, I am, indeed, now a widow’s son. Maybe that’s my in for becoming a free and accepted Mason. After all, they are recruiting.

by Costa Tsiokos, Tue 11/14/2006 08:57:57 PM
Category: History
| Permalink | Trackback | Feedback

Monday, November 06, 2006

the deuce
Don’t call it a comeback: Despite rare sightings and rumors of its demise, the Thomas Jefferson-adorned $2 bill has never been out of circulation. And now, it’s gaining newfound popularity due to perceived novelty and inflation.

And, of course, because it delivers twice the skin with half the folding action:

One group that has embraced the note is the exotic dancing industry. Strip clubs hand out $2 bills when they give customers their change and the bills end up in dancers’ garters and bartenders’ tip jars.

“The entertainers love it because it doubles their tip money,” said Angelina Spencer, a former stripper and the current executive director of the Association of Club Executives, an adult nightclub trade group representing some 1,000 members.

Would the nation’s third President be offended? I think any historian worth his 16 bits would concur that such creative commerce would sync perfectly with Jefferson’s libertarian sensibilities.

Still, I see hassles for regular strip joint clientele. On the one hand, you get a nice monetary memento from your trip to the Mons Venus. On the other, with one look inside your wallet, the wife will know exactly where those two-notes came from — a literal paper trail in your billfold. And damn the luck, you won’t be able to cash out the evidence at your local Taco Bell.

by Costa Tsiokos, Mon 11/06/2006 09:49:29 PM
Category: Business, History, Politics
| Permalink | Trackback | Feedback (1)

Monday, October 23, 2006

gimme five
Hard to believe today marks the fifth year since Apple’s iPod burst on the scene.

Only five years for the little device to achieve iconic status worldwide, enable the viability of the digital-download media retail business (not just music, but now also movies and television shows), and even make its imprint on the Web with podcasts. Not bad for a product that was generally panned when first released in 2001. It also figured to become just another Apple niche peripheral before Steve Jobs shrewdly pushed out Windows-compatible firmware, thus leading to the iPod’s ubiquity.

What’s next? Rumors of a “true” video iPod seem more like wishful thinking now. What about wi-fi?

Microsoft’s Zune player, due to hit the market soon, will boast wireless fidelity, or Wi-Fi, capabilities. The device will let users share songs from one player to the next. Apple CEO Steve Jobs has been dismissive of Zune. In a published interview he said, “It takes forever,” and “By the time you’ve gone through all that, the girl’s gotten up and left.”

Hmm… Sounds like Jobs has been trying to pick up chicks by flashing some tech-device bling. Interesting method of field testing…

It occurs to me that the iPod-Zune faceoff basically reverses the dynamic between Apple and Microsoft. When it comes to OSes, Apple routinely introduces innovations on the Mac that are subsequently copied by Microsoft in the following version of Windows. With the media players, it’s Apple that can sit back and lift any new features — in the early going, the wireless link-up — that the Zune will bring to market, provided it has mass appeal. Far from stealing market share with this “killer app”, Microsoft’s basically going to be an unintentional guinea pig for Apple. Further example of how much the iPod has changed the game in the computer industry.

by Costa Tsiokos, Mon 10/23/2006 10:00:36 PM
Category: History, Tech, iPod
| Permalink | Trackback | Feedback

Sunday, October 22, 2006

Quite unintentionally, over the past week I’ve indulged in three creative works whose (completely unrelated) stories take place in parallel decades, separated by 100 years:

- First, I cracked open my copy of Alan Moore’s and Eddie Campbell’s “From Hell” for a long-overdue re-read. It’s a highly fictionalized/speculative take on the Jack the Ripper murders, so the bulk of it takes place in the late 1880s.

- Then, at the start of the weekend, I caught the opening of Running with Scissors. The Augusten Burroughs autobiopic covers the late 1970s to early 1980s.

- Finally, today I took in Marie Antoinette, which centers primarily on the 1770s and 1780s.

So that’s the 19th, 20th and 18th Centuries in the spread. Which, if you think about it, represents a roller-coaster ride of societal transformation.

The temporal/historical juxtaposition is largely coincidental, and is really the only thing linking these three diverse works of art. But seeing as how I’m occupying my leisure time with them in such a compact window of time, it seems to hold some significance from my perspective.

by Costa Tsiokos, Sun 10/22/2006 08:55:13 PM
Category: History, Movies, Publishing
| Permalink | Trackback | Feedback

Monday, October 02, 2006

This may have occurred to me before, but the recent release and review of “iWoz: From Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It” by Steve Wozniak opened my eyes to a strange parallel:

Both Apple and Microsoft were founded by pairs of partners: Steve Jobs and Wozniak for Apple, Bill Gates and Paul Allen for MS. Coincidental, but not necessarily noteworthy.

But for all the comparisons and contrasts made between Jobs and Gates, consider the similarities between their respective co-founders. Both Wozniak and Allen started out as tech renegades, providing inventive energy for their respective partnerships. Both helped launch their companies, then left the companies they helped build only a handful of years later. Both opted to “do their own thing”, although that represents widely divergent pursuits (for Wozniak, it was business/inventive/educational tinkering; for Allen, it was mostly making even more money). Over the long term, both men have been overshadowed by their former partner, although by the same token neither faded away into obscurity.

The Woz and Paul Allen: Separated at birth? Not so much, but at least kindred spirits.

by Costa Tsiokos, Mon 10/02/2006 10:24:57 PM
Category: Business, History, Tech
| Permalink | Trackback | Feedback

Sunday, October 01, 2006

just under
Even though I followed my usual sports blindness this summer by ignoring baseball, I do recall taking a quick glance at the league pitching leaders sometime in August and thinking, “Those win totals seem kinda low”.

My instincts were true. This season marks the first full (i.e., non-strike) one ever without a single 20-game winner, and the buzz is over whether this is an anomaly or a trend.

“Starts go down, innings pitched go down, complete games go down, so wins go down. It’s kind of simple math,” Orioles pitching coach Leo Mazzone said. “You even talk about shutting people down when they reach 200 innings. If you’re going to win 20 you need to be in the game in the late innings.

“Twenty wins is always a major marker, but 20 is going to go by the wayside and 15 is going to be the standard,” he added. “I don’t think it’s the quality of the starters, I think it’s the evolution of the game.”

I don’t think 20 wins will become unreachable, but it will get rarer. Frankly, I’m surprised it took this long for this to happen. Bullpens are so stocked these days, and MLB is more of a pitchers’ league than ever. More pressure is put on the middle relievers, and so a game’s pitching performance truly becomes a collaborative effort.

It’s easy to rap the pitchers for seemingly being coddled, but that ignores all the work they really do. True, the comparable position players in other sports don’t function the same way. In football, you have one designated starting quarterback every game, who throws his arm off; but he plays only once a week, for a grand total of 20 or so full games per year. In hockey, there’s typically a starting goaltender who plays the bulk of the season; but most teams like to limit that to about 75 percent of the schedule, the backup tends to take one half of back-to-back sets, and no team plays practically every single day the way baseball teams do. The structure of baseball’s season makes it ridiculous to expect a pitcher to hurl 90-mile-an-hour bombs every night.

Hey, if they really want to hang onto the 20-win standard so badly, I guess they could add another 20 or 30 games to the schedule. It’s already ridiculously long now, so I’m sure no one will notice the extra padding…

by Costa Tsiokos, Sun 10/01/2006 06:53:15 PM
Category: Baseball, History
| Permalink | Trackback | Feedback (2)

Sunday, September 24, 2006

“I disapprove of what you say, but I will defend to the death your right to say it.”

Thus goes the most famous quotation attributed to Enlightenment luminary Voltaire. It’s a pithy summation of the principles behind democratic discourse.

But any scholar familiar with the French philosopher knows that, in fact, that quote is nowhere to be found in his works. For good reason, because while Voltaire espoused the essence of that sentiment, he never put it precisely in those words.

The “defend to the death” soundbite turns out to be a reinterpretation of this snippet from “Essay on Tolerance”:

“Think for yourselves and let others enjoy the privilege to do so too.”

Not quite as snappy, is it? I guess even the great thinkers need a healthy dose of spin to make their big ideas more relatable to the hoi polloi.

by Costa Tsiokos, Sun 09/24/2006 03:06:58 PM
Category: History, Wordsmithing
| Permalink | Trackback | Feedback (2)

Saturday, September 23, 2006

Something occurred to me while reading this NPR appreciation of George Orwell’s classic essay “Politics and the English Language”.

Note Orwell’s six simple rules for cleaner diction:

1. Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.

2. Never use a long word where a short one will do.

3. If it is possible to cut a word out, always cut it out.

4. Never use the passive where you can use the active.

5. Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.

6. Break any of these rules sooner than say anything outright barbarous.

Essentially, Orwell advocates lean, mean writing. Which has been the basic rule of thumb for Web writing pretty much since day one.

Orwell wrote his essay 60 years ago. It was less an effort to dumb down interpersonal and media communications than an attempt to avoid the general verbal obfuscation that accompanies propaganda; the idea was to give socio-political manipulators fewer words behind which to hide.

So… Can we draw from this that the Internet is the ultimate anti-totalitarian communication medium? If the ideal in Web communications is to keep it short and simple, then you could argue that Orwell’s vision has been somewhat realized. And not just for text, but for audiovisual material too: Media snippets in the form of songs, short film clips, podcasts and the like point to short-form as the dominant format for Web content. It’s all short and to the point.

I’m not sure Orwell would have envisioned his precepts taking hold as blog posts and IMs. But we can’t always choose the fulfillments of our visions.

by Costa Tsiokos, Sat 09/23/2006 08:06:50 PM
Category: History, Internet, Wordsmithing
| Permalink | Trackback | Feedback (1)

Tuesday, September 19, 2006

In the middle of my narrative rumination on an alternate political destiny for North America if the South had won the Civil War, I inserted a casual geographical development:

The [Louisiana Free State] maintains a balance of power through its extensive petroleum resources and the status of New New Orleans (formerly Morgan City, before the Mississippi River changed course through natural causes during the 1920s) as an international world-class trade center.

Anyone paying attention might have dismissed this as just a wholly fantastical plot device, as unreal as the general prospect of a balkanized North America.

But that betrays a poor understanding of the principles behind counterfactual historical speculation: The adherence to real-world developments, as closely as possible given altered circumstances. Therefore, the idea that the Mississippi River would re-route its output is not only rooted in fact, but also under serious consideration today, as a way to restore some balance to Louisiana’s coastline.

Not to worry, as the eggheads advocating a river run wild aren’t calling for a completely natural flow:

Simply letting the Mississippi shift to the Atchafalaya would do a lot for the sediment-starved marshes west of the Mississippi. But it would leave cities like Baton Rouge and New Orleans — and the petrochemical infrastructure between them — without fresh water or a navigable waterway.

The diversion the scientists propose would be much farther downstream, but where exactly is not at all certain. One possible location is near Davant, about 45 miles southeast of New Orleans. Another is near Empire, further down the river, where the levees could be opened. In either case the river would flow into wet and marshy areas to the west. Another way would have to be found — or constructed — for ships to reach the shipping lane, possibly something engineers call a slack-water channel.

That Atchafalaya path, which was first detected during the 1950s, would indeed roll right past Morgan City. Thus my basis for the fictional New New Orleans (forming in the 1920s, on the assumption that U.S. Army Corps of Engineers construction never would have existed, thus leading to an earlier topographical realignment).

by Costa Tsiokos, Tue 09/19/2006 11:20:40 PM
Category: History, Political, Science
| Permalink | Trackback | Feedback

Saturday, September 16, 2006

In an almost literal forest-for-the-trees scenario, a swath of mountainside forestland in Kyrgyzstan that forms an aerial-view swastika has had locals buzzing for decades.

The mystery’s persistence is in its way surprising, given that as a Nazi swastika the symbol is imperfect, whether by design or because of uneven terrain. Hitler’s swastika was tilted 45 degrees; the formation here is almost level. Moreover, the arms do not mimic the Third Reich’s symbol, but its mirror image — a swastika in reverse.

Investigations center on World War II-era German prisoners of war somehow engineering this as an arboreal nose-thumbing, even though some say no such prisoners ever made it to this corner of the former Soviet Union. Absent some Nazi connection, no one seems to have an alternative explanation.

I think I do, though.

What struck me first was the location of this oddity: Near the Kyrgyz village of Tash-Bashat, “near the edge of the Himalayas”, as described in the Times article. That would be right next door to Tibet and the Indian subcontinent.

Secondly, the shape of this tree arrangement, dubbed the Eki Naryn swastika: A mirror image of the familiar Nazi symbol, and with a level orientation. Why that much of a variance?

Simple. The swastika symbol has a 3,000-year history, being used in various cultures long before Hitler’s crew co-opted it. One of the most prominent places where it appears with regularity is south Asia:

The swastika has held a place of great importance in India and Asia for thousands of years, and is widely used by Hindus, Jains and Buddhists.

The swastika is to be seen everywhere across the Indian sub-continent: sculptured into temples both ancient and modern, decorating buildings, houses, shops, painted onto public buses, in taxis - even decorating the dashboards of the three-wheeler motor rickshaws. Many religious and spiritual books display the symbol. It may well be the most prevalent symbol one will see in India.

I’m surprised this angle wasn’t mentioned. Given the proximity of Tibet as a Buddhist center of influence, it seems much more probable; the particular style of this swastika points far more toward Buddhist use than a Nazi one.

I’m not arguing for the preservation of this tree formation just on this possibility. While I appreciate the swastika’s historical record as a neutral symbol, let’s face it: The Nazis associated it with an unmistakably evil intent, and that’s not going to go away anytime soon (although I think it will, eventually). But I think the hunt for non-existent Nazis in this corner of the globe is pretty silly, when a more obvious explanation is apparent.

by Costa Tsiokos, Sat 09/16/2006 04:28:18 PM
Category: History, Political
| Permalink | Trackback | Feedback (1)

Monday, September 11, 2006

by the minutes
Props go to the New York Daily News for what’s hands-down the most strikingly effective cover image to commemorate today’s fifth anniversary of 9/11.

The timestamps from that morning say it all: First tower hit at 8:46, the second one at 9:03. Numbers in white, set against a stark black background. And a simple but firm motto above: “Remember”.

All day today, this cover continually caught my eye — which, for me and I think most others here, was an appropriate enough silent tribute. I can’t see how this cover wouldn’t win a bushel of design awards.

by Costa Tsiokos, Mon 09/11/2006 09:16:56 PM
Category: History, New Yorkin', Publishing
| Permalink | Trackback | Feedback (2)

Sunday, September 10, 2006

great black north
The origins of Canada’s name:

Despite quasi-comic rumors of explorer Gaspar Corte-Real bestowing the name in 1501 by marking the map of the newly-discovered land “ca, nada” (”here, nothing” in Portuguese), the factual source is more mundane, but with more symbolic significance.

It’s generally acknowledged that the Huron-Iroquois word “kanata”, meaning “village” or “settlement”, was adapted into the modern pronunciation/spelling. Therefore, Canada has the distinction of being the only nation-state in the Western Hemisphere to employ a Native American/American Indian word as a country name.

by Costa Tsiokos, Sun 09/10/2006 12:20:59 PM
Category: History
| Permalink | Trackback | Feedback (2)

Tuesday, September 05, 2006

airborne
The first move toward taking some of the “foot” out of football happened one hundred years ago today. The game’s first forward pass (then called a “projectile pass”) was thrown by Saint Louis University in a game against Carroll College, adding a new wrinkle to gridiron action.

Typically, that first pass ended up as an incompletion — and thus, as per the rules of the day, a turnover. But the second attempt by Saint Louis turned into a 20-yard completion for a touchdown, much to the amazement of Carroll.

Similarly, the concept didn’t catch on right away:

But it took a while for the technique to take hold. For one thing, nobody knew how to pass, of course. And there were disincentives. A completion within 5 yards of the line of scrimmage was ruled a turnover. Oddly, a catch in the end zone was ruled a touchback.

And even though the rules changed to accommodate (and even encourage) passing, coaching orthodoxy took a long while to fully embrace the riskier-than-run play. Games bereft of a pigskin toss by both teams continued; in the NFL, the last such game took place in the 1950s. In college, some teams persisted with the ultra-conservative approach to offense, as Georgia Tech did in its famous 1976 upset of Notre Dame. It’s hard to envision today, but as with most sports innovations, it can take a long time for everyone to get on the same page.

Care to imagine what the college and pro games would look like today without the pass? I’m guessing something like rugby.

by Costa Tsiokos, Tue 09/05/2006 08:17:46 AM
Category: Football, History
| Permalink | Trackback | Feedback

Sunday, September 03, 2006

middle of the action
I’ve made brief (and not-so-brief) mentions here before about my hobbyist interest in the subgenre that is alternate history. And while Marvel Comics’ old “What If?” series first turned me on to the game-like concept of counterfactual historical divergences, there was another comic-book title, far more obscure, that cemented my fascination with fictional might-have-beens.

Captain Confederacy was a self-published little gem from Will Shetterly and Vince Stone. The writer and artist appear to be releasing this (presumably) long out-of-print series onto the Web in blog form. Which saves me from having to dig up my old back issues.

Even better, they’ve posted the boilerplate map that appeared on the inside cover of each issue. More than anything, this map (pictured above) captured my imagination and kept me with the series through its relatively short and erratic run. It’s obviously pretty crude — generated on a vintage 1987-era Macintosh, and following present-day state boundaries just a little too closely. But it was enough.

The original background history behind this alternate reality mapscape is, or will be, covered to some degree on the CC blog. Not everything was explained — Shetterly felt that leaving the development of a balkanized North America vague was part of the fun for readers (with which I agree). But I think I’ll provide my own alternate history explanations, as I recall them from the old Rebel Yell letters pages and with what I felt to be the most satisfying scenarios.

So, a country-by-country legend, according to the map’s numbers (which actually aren’t in an ideal order for my chronological-narrative purposes, but what the heck):

1. Confederate States of America (Caribbean and non-continental territories not shown). The chief action agent in this divergent history. Turned the tide of the War of Secession in 1862, when what became the Battle of Antietam in our history was instead — thanks to the non-discovery of Robert E. Lee’s written battle plans by Union troops — a successful Confederate siege of Washington DC.

Subsequent recognition of the CSA by Britain and France led to a peace treaty and some new international boundaries on the continent. The Confederate States would go on to add Cuba as a state (presumably in a Spanish-Confederate War similar to the factual Spanish-American War, minus the Pacific-Philippines theater), and also annex Mexico’s Yucatan Peninsula and most/all of Central America. But not before some other territorial adjustments were made…

2. Louisiana Free State. The end of the War of Secession left various unresolved territorial and national-interest issues between the USA and CSA. In addition to persistent Confederate claims on Missouri and West Virginia, the US refusal to withdraw from New Orleans and southern Louisiana rankled Richmond, especially because it created a de facto safe haven for runaway slaves from adjacent Confederate states.

The United States insisted it needed to maintain its presence at the mouth of the Mississippi to protect its commercial interests on the river. Meanwhile, the growing influx of ex-slaves over the years fostered a volatile political and militant culture within the US-held enclave.

By the turn of the 20th Century, the United States and Confederate States would fight another war to settle their claims (prompted partly by the CSA’s strong victory during its recent war with Spain over Cuba, and colonial conquests in Mexico and elsewhere). In what came to be known as the Missouri War, the US successfully repulsed CS attempts to “liberate” Missouri and other previous slave-holding areas, and the CS managed to take coastal barrier islands the US had held since the end of their last war.

In the war’s New Orleans theater, while US and CS forces were locked in stalemate, radicalized blacks took the opportunity to declare the establishment of the Free State of Louisiana. The participation of Free State militias in the local fighting produced battle-hardened soldiers, and convinced the CSA that any reclamation of southern Louisiana would come with an unwelcome and protracted guerrilla war.

Among other things, the end of the Missouri War brought a formal relinquishment of CS claims to Missouri and West Virginia; the transfer of the Carolina barrier islands from the US to the Confederacy; US recognition of CS claims to Cuba and other Caribbean/Latin American colonies; and the international recognition of a new country, the Louisiana Free State.

Over the years, the LFS has kept an uneasy co-existence with the bordering CSA. Although the LFS officially does not endorse unrest among the Confederacy’s no-longer-enslaved (but still oppressed) black population, continued defections into Louisiana keep relations between the two countries edgy. The LFS maintains a balance of power through its extensive petroleum resources and the status of New New Orleans (formerly Morgan City, before the Mississippi River changed course through natural causes during the 1920s) as an international world-class trade center. A long-standing alliance with the United States is an additional keystone of LFS security.

3. United States of America (Caribbean and non-continental territories not shown). From its capital in Philadelphia, the USA has carved out a respectable regional power niche while surrounded by perennially-hostile neighboring countries (many on territories formerly held by the United States). To counteract the British-Confederate alliance on its northern and southern borders, the US has maintained a security linkage with the German Empire since the turn of the 20th Century. To check expansionist moves in Latin America by the CSA, the US established protectorates over the Dominican Republic and Puerto Rico, and secured the Panama Canal Zone to counter the CSA’s Nicaraguan Canal.

4. Republic of Texas (southern boundaries not shown). Upon achieving independence, the Confederate States of America stretched from the Atlantic in the east to the Colorado River at its extreme western border (thanks to its modest military success in New Mexico, effectively laying claim to the southern half of that Territory).

By the dawn of the 20th Century, expansionist impulses in the CSA inspired military invasions into Latin America. Using revolutionary unrest and border incursions as pretext, the Confederacy launched a war of conquest into Mexico. Thanks to proximity, this Mexican campaign was led largely by Texan troops. By the time Mexico sued for peace, the CSA had overrun the northern part of the country and the Yucatan, and subsequently annexed them.

The immediate post-war period brought discontent among factions of Confederate society who favored the annexation of all of Mexico. In particular, Texas felt it had contributed a disproportionate share of the war effort, only to end up with an incomplete result. Added to this was anxiety over the continued US occupation of nearby southern Louisiana, and the perception that Richmond wasn’t doing enough to resolve that situation. In part due to these internal pressures, and also due to emboldened confidence from recent military successes, the CSA embarked upon the Missouri War soon afterward.

The result of that war — essentially a blunting of growing Confederate power, if not an outright defeat — led to an acceleration of dissatisfaction in Texas. Another unfulfilled military outcome, made even worse with the establishment of the black-governed Louisiana Free State on the state’s border, brought to the fore calls for Texan secession from the CSA. Popular sentiment supported political maneuvers in this direction, and within a couple of years of the end of the Missouri War (circa 1910), Texas formally seceded from the Confederate States and re-established the Republic of Texas.

In a move that surprised the world stage, the Confederacy acceded to Texas’ actions. The sentiment in Richmond, especially with Confederate President Woodrow Wilson, was that a country founded upon the principles of secession couldn’t justify military action against that enshrined legal right. Pragmatic reasons also factored in: A war with Texas invited intervention by the United States, possibly leading to a permanent US-Texan alliance. Additionally, the blossoming Confederate Caribbean empire was already stretching the country’s resources. Parting with Texas on friendly terms seemed the safest course.

Texas embarked upon an existence as an independent nation-state, under certain conditions. Contingent upon its divorce from the CSA, it granted the Confederacy naval port leases on the Gulf and Pacific coasts in exchange for undisputed sovereignty over New Mexico and the newly-conquered Mexican territories. It also agreed with the CSA to preserve the remnant rump state of Mexico, chiefly as a buffer between Texan territory and Confederate holdings in Yucatan/Central America.

Throughout the rest of the 20th Century, Texas benefited from its vast petroleum resources, leveraging them into a role as a power-broker in North America and beyond. Despite early hostility between the two countries, Texas developed close ties with the Louisiana Free State, both as a buffer against the Confederate States and through their mutual membership in OPEC.

5. Great Spirit Alliance (northern boundaries not shown). A consequence of the Anglo-French brokered end to the War of Secession was the creation of a British-backed Native American independent state. Ever since the War of 1812, Great Britain had been seeking such a buffer nation as protection for its British North American holdings, and it seized upon the opportunity to set one up on the Great Plains.

Initially composed of the Plains tribes indigenous to the area, the so-called Indian Territory was supplemented by forced and unforced relocations of other North American tribes from British North America and the Confederate States (but not from the United States, which wasn’t inclined to boost the population of a hostile country).

Eventually, elements in the Indian Territory, encouraged by independence movements elsewhere, formed the Great Spirit Alliance, a tribal-based governing system mutually allied against surrounding states. The GSA spearheaded independence from British influence, allied itself with the Métis settlers in the Canadian prairies for a push to the north, and established itself as a Native American homeland in the heart of North America. Mistreatment of Natives elsewhere in the Americas keeps Great Spirit relations with neighboring countries frosty.

6. Deseret. The end of the War of Secession took huge chunks out of the United States’ pre-1860 boundaries, but still left the country with contiguous territories from the Atlantic to the Pacific. However, that integrity hung by a thread.

In Utah Territory, embittered Mormons considered the success of the Confederate States in achieving independence. Having had their own clashes with the Federal government since before their exodus to the Great Salt Lake Desert, Mormon leaders saw the establishment of their own independent government as the surest means toward preserving their isolation and way of life.

For its part, the United States now saw Utah as an essential overland bridge to California and the Pacific Coast. Following the war, Philadelphia placed renewed priority on completing the Trans-Continental Railroad, framing it as a lifeline holding what remained of the country together.

Seeing an opportunity to destabilize the US further and expand its North American influence, Great Britain (and to a lesser extent, the CSA) dispatched agents among the Mormons to encourage revolt and promise support. Thus emboldened, the Mormons openly rebelled against US authorities, and by the end of the 1860s declared an independent state of Deseret. With British support, Deseret claimed Nevada and the western portion of what remained of US-held New Mexico Territory, along with Utah.

Despite the obvious consequences — the cutting off of the Pacific coast from the rest of the country — the United States was largely forced to accept the breakaway of Deseret. War with the Mormons, coming on the heels of the massive losses from the War of Secession, was too much for a weary American public to accept. Popular sentiment was that Mormon country wasn’t worth a war to keep, especially if it led to a wider conflict with the British and Confederates.

Deseret thus managed to establish its freedom. The country, while denying accusations of being a theocracy, nonetheless has been dominated by the leadership of the Church of Latter-Day Saints for its entire existence. Relatively resource-poor, it has relied upon extensive trade agreements with neighboring countries.

7. People’s Republic of California (southern boundaries not shown). The establishment of Deseret to the east cut off California from the rest of the Union, exacerbating already-existing feelings of disassociation. With the lack of direct rail or other overland routes with the US, California declared itself a republic in 1875. The new country was immediately recognized by the Confederate States, Great Britain, France and other powers, and while relations were strained with Philadelphia, California managed to gain independence without warfare.

California grew slowly over the next couple of decades. Wary of foreign influences, the country forged alliances with France and Russia as bulwarks against the CSA and Britain. It also began to establish an informal sphere of influence over neighboring Baja California in Mexico; this led to the formal annexation of the peninsula during the CSA invasion of Mexico in the early 20th Century.

California’s preoccupation with continental affairs left it unprepared for a new challenge from across the ocean. Seeing California as an ideal trans-Pacific base for its growing empire, Japan exerted military influence over the North American country beginning in the 1910s. By the end of that decade, Japanese naval bases were established in Los Angeles and San Francisco, and California was effectively a puppet state. A Japanese-sponsored People’s Republic was declared in the 1920s.

In the years since, Asians (particularly Chinese expatriates from other parts of the Japanese Empire) have become the dominant population group in the People’s Republic. While nominally independent, California remains firmly within Japan’s sphere of influence.

8. Pacifica (northern boundaries not shown). Along with California, Oregon Territory was cut off from the rest of the United States when Deseret achieved independence. While California opted for independence, British agents induced settlers in the Pacific Northwest to join British North America. Threatened by the newly-formed Indian Territory to the east, and not expecting substantial protection from Philadelphia, the residents of sparsely-settled Oregon chose to place themselves under British protection, becoming an extension of British Columbia.

The proclamation of the Great Spirit Alliance in the former Indian Territory, and its subsequent northward expansion, prompted British Columbian action. The remote British North American province declared independence under the name Pacifica.

Pacifica has maintained good regional relations. It has particularly close ties with its northern neighbor Alayeska, supporting the displaced Tsarist regime against its rival Soviet government in Russia.

by Costa Tsiokos, Sun 09/03/2006 11:23:17 PM
Category: Creative, History, Publishing
| Permalink | Trackback | Feedback (5)
