Smart Politics in Pictures and Words
Originally published by ANewDomain:
As a frequent traveler to and author of several books about the former Soviet republics of Central Asia, I was surprised by the news that the FBI arrested a citizen of Kazakhstan along with two men from Uzbekistan for attempting to travel to Syria to join the Islamic State of Iraq and Syria.
Although Kazakhstan has a majority Muslim population, it is a highly secular culture where radical Islamism has had less success attracting adherents than in neighboring countries. Walk the streets of major cities like the capital, Astana, and the intellectual center, Almaty, and you will see casinos, bars, men smoking and drinking beer and vodka, and countless women in miniskirts and tight-fitting blouses.
These troubling arrests – they practically fit the dictionary definition of entrapment, the federal government’s definition of “material assistance to a terrorist organization” is overly broad, and anyway, why should it be illegal to go and fight for a foreign army that isn’t legally at war with the United States? – are still a developing story, so what follows necessarily relies upon speculation.
Akhror Saidakhmetov, 19, is the youngest of the three. The feds intercepted him at John F. Kennedy International Airport in New York City early Wednesday morning as he tried to board a flight to Istanbul. Turkey is a typical transit route for would-be ISIS recruits trying to get into Syria.
My off-the-cuff assumption was that his radicalization must have been influenced by his fellow suspects, both of whom are from Uzbekistan, particularly his roommate and former fellow restaurant worker, 24-year-old Abdurasul Juraboev. But that may not be the case.
Saidakhmetov is from the southern Kazakh city of Turkistan. He left for the United States at age 16 and has not been back.
According to the Kazakhstani Foreign Ministry, however, he is listed as an ethnic Uzbek.
The third man, Abror Habibov, 30, was arrested in Jacksonville, Florida.
If the Uzbek connection turns out to be a central thread in the three men’s desire to join the Islamic State, a Taliban-style attempt to reboot the caliphate eliminated at the end of World War I and establish a medieval interpretation of sharia law in the Middle East, it would not be surprising to those of us who pay attention to Central Asia. When I heard that the three were all ethnic Uzbeks, I immediately thought:
Islamic Movement of Uzbekistan.
The Fergana Valley is a mountainous geographical knot connecting Uzbekistan, Kyrgyzstan and Tajikistan. Long a hotbed of Islamic extremism, particularly among ethnic Uzbeks, Fergana is the center of power of the Islamic Movement of Uzbekistan. The IMU, whose members attended Afghan training camps during the Taliban era in the late 1990s, is dedicated to the overthrow of Islam Karimov, the authoritarian dictator of Uzbekistan.
Given their ages, it’s unlikely that any of the three men, including Habibov, were members of the IMU. In the age of radical jihad, however, self-radicalization is inspired by the ideology in the air around you. If you’re fundamentalist and Muslim and radical in Uzbekistan, or still have ties to that country, the IMU comes with the territory the same way that growing up Irish and Catholic in the 1970s, and resenting the British occupation forces, necessarily leads one to embrace, if not join, the IRA.
All of the Central Asian republics are seriously screwed up, and all of them are run by authoritarian despots, but none are nearly as heinous or universally despised by their citizenry as Karimov.
Karimov, a Communist Party boss who kept his job after the fall of the USSR, runs one of the most violent and corrupt dictatorships in the world. Among other atrocities, he has personally supervised the massacre of hundreds of peaceful demonstrators and ordered political dissidents to be either boiled or frozen to death. Central Asia watchers have long expected Karimov-related blowback.
When I traveled in Uzbekistan, everyone I met – secular or religious, regardless of ethnicity, wherever they stood on the spectrum of political ideology, young and old, male and female, rich or poor – despised Karimov, and wished for his speedy painful death. Unfortunately for the people of Uzbekistan, that’s not going to happen anytime soon. That’s because he is one of America’s best friends in the so-called global war on terror.
It is not difficult to imagine three young Uzbek men, struggling to make their way in New York City, feeling resentment against the West and in particular against the United States, which has long propped up a regime which has looted spectacular amounts of wealth from and abused their countrymen. Was this a case of chickens coming home to roost, or simply three guys who were led astray?
Sooner rather than later, I suspect that we will find out. Whatever the case, US foreign policy has contributed to radicalization in a Central Asia that, after 1991, could have easily gone the other way had we simply let their domestic political situations sort themselves out, rather than insist upon supporting a group of ruthless tyrants who were wildly unpopular among their own people, simply to cut deals for cheap oil or natural gas or to lease airfields for American military operations.
“Although Central Asian governments have attempted to crack down on extremism within their borders, analysts suspect that ISIS has effectively targeted Central Asian nationals for recruitment,” reports the Christian Science Monitor. That’s what happens when you alienate people by giving them nothing to lose: the beneficiaries are inevitably the most extreme groups, like the Islamic State. “A report published last month by the Brussels-based International Crisis Group claimed that up to 4,000 recruits from Central Asia had joined ISIS in Syria and Iraq. Many of these recruits are from the Fergana Valley, an ethnically diverse region that includes eastern Uzbekistan. The Kazakh National Security Committee estimates that about 300 from that country, about half of them women, are fighting in Syria for ISIS.”
Originally published by The Los Angeles Times:
The police shooting of a 39-year-old homeless man in the skid row section of downtown Los Angeles is prompting comparisons and reactions similar to those that followed the police killings of unarmed black men in Ferguson, Mo., and New York. The identity of the man is still not clear, but he was known as “Africa” to some who knew him on the streets.
The incident is still under investigation but many question how dangerous a man without a gun can be to four highly trained law enforcement professionals, all armed. The LAPD says its officers first approached Africa in response to a robbery call, and that its officers shot the man to prevent him from taking one of the officers’ guns. The revelation that Africa was a convicted bank robber who served a long prison term seems to bolster the image of a dangerous person. In Ferguson, police also pointed to the victim’s alleged involvement in a robbery.
Then there’s the context of lousy community relations. “Skid row has been home to police occupation under the Safer Cities Initiative,” Steve Diaz, an organizer for the Los Angeles Community Action Network, said at the Los Angeles Police Commission’s weekly meeting. “They clear people out in the name of gentrification.”
Since at least one of the LAPD officers was wearing a body camera, the investigation is also being viewed as a test case for a technology that advocates hope will hold rogue cops accountable and defend honest ones against false charges of brutality. The claim of a St. Louis man that a policeman turned off his dashboard cam before beating him, following a similar story in New Orleans late last year, has skeptics wondering whether videotaping really is a solution in such cases.
Maybe it’s because I’m old enough to remember domestic policing before it was militarized and excessive force became the norm, but for me this is as much a story about officers who escalate violence far too quickly as it is about other relevant issues, such as racism.
Writing about Michael Brown, the man killed in Ferguson, a letter writer to the Wall Street Journal noted: “It is unacceptable that Officer Darren Wilson had access to a Taser and intentionally didn’t carry it. We will never know whether a Taser would have de-escalated the encounter between Officer Wilson and Michael Brown, and prevented Mr. Brown’s death.”
What should be something from Kindergarten Cop 101 has gotten lost in many cases: Police should do everything they can to avoid violence in the first place. Then, if a peaceful resolution isn’t possible, force should be escalated gradually. That did not appear to be what happened in Brown’s case. And Tamir Rice, the 12-year-old Cleveland boy shot to death two seconds after a rookie officer got out of his police car, didn’t appear to have a chance to cooperate or surrender. Akai Gurley, 28, was killed by a bullet fired by a nervous NYPD officer who heard a noise in the dark stairwell of a housing project.
You can’t blame police officers for being scared when they confront possible suspects. But it’s fair to expect proportionality based on, first of all, the alleged offense. It’s hard to tell from the video in the L.A. case, but there is reason to suspect that this incident moved from confrontation to physical engagement way too quickly. Then there’s the proportionality of physical force: You’ve got four armed officers taking on an unarmed man. Frankly, if they don’t have what it takes to keep the guy down, they need to go back to the police academy.
Originally published by ANewDomain:
A couple of weeks ago, I wrote an essay for BreakingModern about the Brian Williams scandal, and how it reflects the sick cult of militarism that has ruled America and its media since 9/11. “You can tell a lot about a society’s values from its lies,” I said.
Williams’ attempt to portray himself as some kind of bad-ass journo-soldier was preceded by Hillary Clinton’s false claim of dodging enemy fire in Bosnia and Connecticut Senator Richard Blumenthal’s lie that he had served in Vietnam.
Now two more public figures are being accused of ginning up accounts of courage in war zones.
Fox News host Bill O’Reilly is fending off charges that he repeatedly bragged about dodging bullets during the Falklands War though he never came close to the war zone, having been confined by Argentine authorities to the capital, Buenos Aires. He is attempting to defend himself by saying he was caught in a riot there, where he was shot at by police or soldiers, but most of his fellow CBS veterans remember it differently.
O’Reilly, characteristically aggressive, threatened a reporter for the New York Times that he would retaliate if they weren’t fair to him: “I am coming after you with everything I have,” he said. “You can take it as a threat.”
Disclosure: I have appeared several times on “The O’Reilly Factor.” A statement by one of O’Reilly’s accusers sums up my experience: “Nobody gets a fair shake. [He] just wants to beat them up, call them names.” I didn’t care about that, but I’m still annoyed about the fact that, when I went on his show at the beginning of the occupation of Afghanistan to predict that the war would go badly for the United States, he promised on the air — after mocking me and questioning my patriotism — to have me back later to see who was right.
I sure would have liked to have performed my little victory dance.
And now there is another apparent case of an armchair warrior pretending to have served in the military: U.S. Department of Veterans Affairs Secretary Robert McDonald. He apologized yesterday for saying that he had served in the Army’s elite Special Forces.
In fact, McDonald graduated from West Point in 1975 and served in the 82nd Airborne Division – hardly the resume of a wimpy pacifist unqualified to attempt to unravel the hot mess that his department has become.
As I wrote about Brian Williams, the O’Reilly and McDonald cases tell us a lot more about contemporary American culture and the cult of militarism than they do about these two guys.
If it matters, Williams has been in harm’s way in war zones. O’Reilly has an enviable career as a successful TV and radio host, best-selling author and yes, a prehistory as a real journalist.
McDonald really was a soldier, if not the exact kind of bad-ass super trooper he was compelled to present himself as.
The point is, why do these people, who are incredibly accomplished professionally and in some cases have demonstrated real courage under fire, feel tempted to puff themselves up in this particular way?
We have developed a martial culture to the exclusion of all else. You don’t see teachers thanked for their service on television – hell, you don’t really see teachers on television much at all. Nor nurses. Nor musicians. Nor playwrights. Nor artists. In the United States in 2015, the way that you get people to show you deference is to claim to have fought in one of America’s many optional wars of aggression or, failing that, to have gotten caught in the line of fire as a journalist, or perhaps a former hostage.
If you don’t see that there is something terribly wrong with that, odds are you are either part of the problem or one of its victims.
Originally published by Breaking Modern:
Used to be, bands peaked out early.
The good ones cranked out three or four great albums before drugs, complacency, exhaustion, fights over girlfriends or money began to take their toll. Then they’d either break up or, not being suitable for gainful employment or lacking the imagination to try something new, soldier on.
It’s a sad old story for generations of oldsters before you. As time passed and many of the fans moved on, diehard loyalists made do with new albums and then CDs that sounded enough like the glory days to keep them satisfied and turning up at concert halls. This kept grizzled old rockers on time with the rent — but rarely if ever achieving the magic spark of the early years.
This happened even to the most brilliant rock and rollers any music freak alive today has heard about and even listened to.
Take David Bowie. He was good for seven or eight iconic albums, but since the early 1980s we’ve lost all hope of another soaring achievement at the level of, say, Aladdin Sane.
And the once great Elvis Costello turned out one amazing disc after another between 1977 and 1988, but by the mid-1990s it had become clear that, despite his admirable willingness to stretch outside of his comfort zone with collaborations with artists working in other genres, the new stuff was pretty much only going to appeal to Declan’s hard-core fans.

Look, you spend the first 20 or 30 years of your life accruing experiences for songs that go into your first few records. Then you become a professional musician, and pretty much the only thing that can go into your new stuff is what you did last year, while you were touring and negotiating with your surly record company.
And unless your wife or mistress (or both) dumps you, your muse just doesn’t have that much to work with.
But the old slow-fade dynamic appears to be a thing of the past.
I don’t know if musicians are responding to the fiscal pressures of digitalization, which has made it more difficult for creative types to monetize their work, or maybe it’s just a 2015 thing. I’ve been amazed, lately, at some of the great new music old bands now are releasing.
Now, this isn’t like Bob Dylan, whose every musical fart is always greeted by corporate music media as though, well, it didn’t really suck. That guy was old when I was a kid and he’s been boring for years. We’re talking about bands who have been around a long time and actually really keep getting better. Like they practice. Or something.
Bear in mind, many of these reboots result from the kind of personnel changes that typically destroy bands. I mean, imagine if the post-Jim Morrison Doors LPs were as good as or better than the original Doors records, as opposed to the notorious disasters they actually were. Imagine if it hadn’t mattered that guitarist Mick Jones of the Clash – who wrote most of the songs – was missing from “Cut the Crap.”
Is such a thing even possible?
Well, maybe so. Or maybe the veteran performers whose new stuff is so good benefited from never playing huge arenas or being able to afford distractingly large mounds of cocaine. Clean living and poverty have some real rewards for artists.

The Buzzcocks: Brit Punk Gone Wild
Consider legendary British punk rockers the Buzzcocks and their most recent album. Not so excitingly named “The Way,” it continues a remarkable forward movement for a group that burst onto the scene with androgynous lyrics about the politics of romance and relationships nearly 40 years ago.
Like many bands from this punk generation, The Buzzcocks broke up in the 1980s and reformed in the 1990s. And this band returned full force. It kept its signature buzzsaw guitars and still maintains its core concerns, all while evolving its signature sound and songwriting chops.
Highlights of the reunion period include “Modern” (1999), the self-titled non-debut “Buzzcocks” (2003) and 2014′s “The Way,” which switches back and forth between songs written and sung by Pete Shelley and Steve Diggle. “Virtually Real,” about social media, would come off as contrived and insipid in the hands of lesser social satirists. In the hands of Buzzcocks, it’s a gem.

Client: Frosty English Electronica
Hard-hitting English electronica band Client has reveled in mystery since its founding in 2002. The group’s two original members were identified only as Client A and Client B, and their images never appeared on their CD artwork. Blending retro 1980s synthesizers and frosty lyrics influenced by the late singer Nico and the early 1980s French Cold Wave movement led by KaS Product, Client was a reliable favorite – until Sarah Blackwood, the Dubstar lead singer known in Client as Client B, left the band.
This is one of those situations that usually spells music death. Yet the 2014 CD “Authority” not only maintained enough of the original musical and conceptual aesthetic to satisfy existing fans, but also moved things forward with more forthright political commentary on the nature of oppression in the 21st century, all set to an inevitable dance beat behind a new singer whose voice is different enough from Blackwood’s to carve out her own territory.
Don’t get me wrong: I still love the old albums. But the new one is just as good, if maybe a bit more contemporary.

the dB’s: a Return to ’80s American Pop Power Form
For my money the seminal American power pop band the dB’s never recaptured the highs of their somewhat neglected 1984 masterpiece “Like This.” Yet here we are, three decades later, after a series of on-again off-again albums, including the insanely flat 1994 “Paris Avenue,” with “Falling Off the Sky.”
Okay, so this one came out in 2012, but I didn’t notice and neither did many other people so I’m talking about it now.
Critics like to say this a lot, but this really is a true return to form, plus it moves the band forward in a way that doesn’t spell “old.”

The Adverts: TV Smith and Melodic Post Punk
Of the many unjustly overlooked musical artists out there, there has never been a bigger gap between soaring talent and popular obscurity than that of singer-songwriter TV Smith, formerly the lead singer of the Adverts, who were contemporaries of the Buzzcocks in the late 1970s in the UK. Smith writes heart-wrenching, droll elegies to those crushed by the steamroller of heartless capitalism (e.g., “It’s Expensive Being Poor“), set to delightfully melodic postpunk.
Year after year, he puts out one CD after another, each better than the one before, which was itself amazing. Most recent was last year’s “I Delete,” which blends elements of classic late 1970s British punk, 1980s hair metal, 1990s grunge, early 21st century postproduction gimmickry and pretty much everything else that has ever mattered to me. Lots of amazing songs here, but “It Don’t Work,” about the feelings and failings of technology on both a personal and political level, stands out. It’s unbelievable to me that this is a guy who made it big with a 1970s novelty song, “Gary Gilmore’s Eyes.”

Frank and Walters: Irish Alternative Band
Finally, another revelation, which thanks to the Internet I just found out about even though it came out in 2012: the Frank and Walters, an alternative rock band from Ireland formed in 1990 and famous for jangly guitars and beautiful, winsome lyrics about the nature of desire, got back together and issued a new CD, “Greenwich Mean Time.” Here the triumph isn’t so much that they moved forward. They didn’t.
“Mean Time” sounds like they never went away. It’s a seamless transition from 2006 to 2012, which is kind of amazing when you think about it.
Sometimes, when you love a band, more of the same is good enough. And sometimes, rarely, it might even be better.
Originally published by ANewDomain.net:
“When did Americans decide that allowing our kids to be out of sight was a crime?” asks a mom in the northern suburbs of Washington DC whose husband was threatened with arrest by child welfare agents who said they would take away their kids – for the “crime” of allowing their 10-year-old son to walk home free of adult supervision from a nearby public park.
Danielle Meitiv cites other examples of what appears to be a growing trend: the criminalization of free-range childhood. In the Washington Post, Meitiv writes:
“Last summer, Debra Harrell of North Augusta, S.C., spent 17 days in jail because she let her 9-year-old daughter play at a park while she was working. In Port St. Lucie, Fla., Nicole Gainey was arrested and charged with neglect because her 7-year-old was playing unsupervised at a nearby playground, and Ashley Richardson of Winter Haven, Fla., was jailed when she left her four kids, ages 6 to 8, to play at a park while she shopped at the local food bank.”
Lenore Skenazy sparked controversy with a 2008 New York Times essay bearing the self-explanatory title “Why I Let My 9-Year-Old Ride the Subway Alone.”
“Was I worried? Yes, a tinge,” Skenazy admitted. “But it didn’t strike me as that daring, either. Isn’t New York as safe now as it was in 1963? It’s not like we’re living in downtown Baghdad.”
Actually, in downtown Baghdad, kids are everywhere.
How old must a child be to be left alone at home? Only five states set a legal limit. (I wonder how many Illinois parents know they are risking child endangerment charges by trusting their 13-year-old not to burn down the house?) As a guideline, experts currently say that, while it depends on the psychological maturity of the child, 7- to 10-year-olds can handle short periods on their own and that kids over age 12 can manage a whole day, but shouldn’t be left alone overnight.
To answer Meitiv’s question, there appears to have been a major transformation during the 1990s and 2000s in attitudes about balancing the competing concerns of keeping kids safe and fostering the independence necessary to mature into adulthood.
Growing up in a suburb of Dayton, Ohio, in the 1970s, I remember adults were downright cavalier about children. Starting in third grade, I walked to and from school in all kinds of weather. It was two miles each way, a significant distance on those short little legs, especially during an ice storm. (School superintendents were stingier with snow days back then.)
I rode my bike all over town, especially during the summer to the swimming pool, which was about a seven-mile round-trip. My mother wasn’t especially neglectful; every kid I knew carried, as I did, a pocket full of dimes for a pay phone in case they got in trouble.
The irony is that back then, when parents were running off to “key parties” and letting their kids be babysat by “Gilligan’s Island” reruns, it was a far more dangerous time to be a child in America than it is now, when local law enforcement is cracking down on people who refuse to be helicopter parents.
Street crime has plummeted since I was a kid in the 1970s. It’s not like predators were snatching children off the streets all the time, but it wasn’t unheard of. Twice before turning 16, sketchy men tried to lure me into their cars. A mile up Route 48, the same street where I walked to high school every morning, a serial killer kidnapped, raped and murdered a 14-year-old girl going to her own school. Most kids from the 1970s generation have a story like that, one or two degrees of separation removed.
That’s not the case now.
Of course, if you are a would-be child killer, it’s going to be pretty difficult to satisfy your bloodlust in a society where you never see kids walking the streets.
Keeping kids safe is a parent’s primary responsibility. People my age – I’m 51 – ruefully recall feeling like no one cared about our safety when we were children. We shouldn’t return to that era. But parents have another, equally important duty: turning their kids into grown-ups.
How the hell are today’s kids going to become the adults of tomorrow?
When I was nine years old, my mom let me take the city bus downtown to Dayton’s edgy urban core. I have to think that familiarizing myself with mass transit slowly, during my teenage years in a smaller city, made it easier for me to transition to the New York City subway, which I had to figure out at the age of 18 as a student at Columbia University.
Similarly, although sometimes I worried that my mother had gotten into a car accident when she ran late at work, it was a good experience to learn, again over time, that 99% of the time there’s nothing to fear even when you are afraid. Besides, being left home alone would be a less fraught experience today thanks to text messaging and cellular phones.
By the time I was 15 years old, I had a pretty good sense of direction. We didn’t have Google Maps, but we had the printed kind, and the experience of driving around and sometimes getting lost, so we soon had a strong sense of where things were and how to get there. You need that as an adult. As I watch my friends shuttle their kids around by car, I always wonder, how can these children – who have absolutely no reason to know or care where they are being taken or how it fits into context – make the jump into fully realized independent adulthood?
“The pendulum has swung too far,” Meitiv wrote, and I agree. “We need to take back the streets and parks for our children. We need to refuse to allow ourselves to be ruled by fear or allow our government to overrule decisions that parents make about what is best for their children.”
This being America, it’s probably going to take a bunch of legal battles in the form of parents fighting back against out-of-control child welfare authorities — who in 45 states are “enforcing” non-existent laws — to restore some sense of sanity. In the meantime, we are engaged in a social struggle that will determine whether the first totally online/totally protected generation of American children somehow manage to develop into viable adults.
Originally published at The Los Angeles Times:
I have never understood why society tries to prevent people who want to die from doing so.
As a young Catholic, I was told that suicide was a sin. My priests couldn’t explain why. Besides, assuming that the suicide attempt is successful, you’re dead. Who’s going to shake you down for penance?
But assisted suicide — helping someone kill himself or herself, typically to bring an end to a painful terminal illness — is illegal in most of the United States, including California. A proposed bill, however, could change that.
“California’s new legislation is modeled on Oregon’s, making assisted death available to those 18 or older who are diagnosed with a terminal illness that is expected to result in death within six months, provided they are mentally capable of making healthcare decisions. The proposal would go further, giving pharmacists and physicians legal immunity in such deaths,” Patrick McGreevy reports in The Times.
Passage is anything but a shoo-in, in part due to opposition by religious groups. It’s almost as if the living are members of a club, and anyone who wants to quit makes those who remain feel insecure.
But there are also valid reasons to worry that unscrupulous political leaders and decision makers in the healthcare industry could abuse legalized euthanasia. Not least is the example of the Nazis’ so-called euthanasia program of killing disabled people in gas vans and mass shootings, which began before the Holocaust.
The government reserves the right of life or death over its citizens. But if one of them wants to choose death, even when she is desperately ill and in pain, the government says it’s illegal.
Advocates for disabled people also are against the proposal, which could become a ballot proposition should it fail in the Assembly. They don’t trust the corporate healthcare industry: “There’s a deadly mix when you combine our broken, profit-driven healthcare system and legal assisted suicide, which would instantly become the cheapest treatment,” Marilyn Golden, a senior policy analyst for the Disability Rights Education and Defense Fund, told Times columnist George Skelton. Death panels, in other words.
This is a complicated issue. But my sympathies lie with the right of sick individuals to decide — free of pressure from family members or hospitals looking to save money — if and when they have suffered enough and want to opt out. If there’s a chance for recovery, doctors clearly have a moral obligation to pull out all the stops. When there’s not and the future only points to more suffering, not so much.
The inspiration for this week’s cartoon is the paradox of a government that cavalierly kills people regularly, at will and often unnecessarily. It does so with drone strikes overseas, or executing inmates who pose no threat to anyone while they are locked up in maximum security prisons here, or more often through neglect — by failing to help the poor, unemployed, underemployed, mentally disabled and other marginalized people.
Government wants to decide whether you live or die, but it won’t give you the same choice. It’s sick.
“American Taliban” John Walker Lindh stripped nude and tortured after his capture.
There have been several high-profile arrests of wannabe jihadis who allegedly intended to fight with the Islamic State of Iraq and Syria, including three New York City residents last week, charged with providing “material support and resources…to a foreign terrorist organization.” They each face up to 15 years in prison.
Over the last year the United States has intercepted and arrested at least 15 young Muslims for wanting to join ISIS.
If I went to Syria to join ISIS, I could be arrested and charged with felonies that carry long prison sentences.
As citizens of a supposedly free country, Americans ought to be able to travel anywhere on the planet, and fight for any army we please, as long as that force is not at war against the United States. This, by the way, has been American law for the last 120 years.
Neither ISIS nor the United States has declared war against the other. (Since the U.S. does not recognize ISIS as a nation-state, it couldn’t do so anyway.) Anyway, ISIS is more of a frenemy: the Obama Administration was still funneling money, weapons and trainers to the insurgent factions that metastasized into the Islamic State in their war against Syrian President Bashar al-Assad well into 2014. We still want them to beat Assad…or do we?
The Center for Constitutional Rights complains that the “material support” statute governing these prosecutions is overly broad because along with the USA Patriot Act it criminalizes “almost any kind of support for blacklisted groups, including humanitarian aid, training, expert advice, ‘services’ in almost any form, and political advocacy.” It’s downright absurd when the blacklisted “terrorist group” in question was a U.S. ally until last summer.
It ought to go without saying that I have no sympathy for ISIS. Their ideology is idiotic, medieval and repugnant. Among numerous other atrocities, they kidnap, torture and execute war correspondents — my colleagues. Last week’s video of ISIS fighters destroying archeological treasures at the museum in Mosul, Iraq had me shouting “barbarians!” at my screen. They’re disgusting.
But I am also disgusted by the U.S. government’s imperialistic campaign to trample the sovereignty of other nations in its attempt to dominate the entire world. Not only does the U.S. invade other nations without just cause, it routinely violates countries’ airspace with drones, airstrikes and assassination raids. The U.S. arrests non-U.S. persons for acts committed outside the U.S., kidnaps them, and prosecutes and jails them in the U.S.
If you want to join the French Foreign Legion or the Australian Coast Guard or the Taliban or ISIS, it’s your stupid business — unless, as I said above, a formal state of war exists between them and the United States (which would be treason, punishable by death).
There is a long history of Americans traveling abroad to fight in foreign armies. American volunteers in the Abraham Lincoln Brigade defended the Republican government against Franco’s fascists in the Spanish Civil War of the late 1930s. In the 1980s thousands of American internacionalistas fought on the Sandinista side in Nicaragua against American-backed right-wing death squads. Because they fought for left-of-center causes, they were accused of ideological subversion by reactionary government officials — but, thanks to an 1896 court ruling, they weren’t prosecuted.
Over 1,000 Americans serve in the Israel Defense Forces.
As with so many other basic legal precepts, your right to serve in a foreign army has been eroded since 9/11, marked by the prosecution and imprisonment of “American Taliban” John Walker Lindh. Lindh joined the Taliban in 2000 and was captured by U.S. forces during the fall 2001 invasion of Afghanistan. He received a whopping 20 years in federal prison for “providing services” to the Taliban and “carrying an explosive” (which, for a soldier in a war zone, is hardly unusual).
At the time I was one of the few public figures — perhaps the only one — who criticized the Bush Administration’s treatment of Lindh, who was brutally tortured by American troops. Lindh, I pointed out, joined the Taliban before 9/11. Even after 9/11, the U.S. never declared war against Afghanistan — so he should have been repatriated without punishment.
Prosecutions under the “material support” statute escalated following the media’s passive acceptance of the lengthy prison sentence for Lindh.
Locking people in prison for the crime of youthful idealism/naiveté is a perversion of law and morality. They are not a threat to the U.S.
Young men and women who successfully make it into Syria and join ISIS shoot at Syrians and Iraqis. The only Americans they might endanger are U.S. occupation troops assisting collaborationist Iraqis — who are there illegally, in an undeclared war. What we think of ISIS is irrelevant; many countries are ruled by vile despots.
From a practical standpoint in this war for hearts and minds, throwing kids who have never fired a shot into federal penitentiaries for ridiculously long prison terms confirms the narrative that the West is at war not with Islamic extremism, but with Islam itself.
As an American, I hate to see us lose another right.
(Ted Rall, syndicated writer and cartoonist for The Los Angeles Times, is the author of the new critically-acclaimed book “After We Kill You, We Will Welcome You Back As Honored Guests: Unembedded in Afghanistan.” Subscribe to Ted Rall at Beacon.)
COPYRIGHT 2015 TED RALL, DISTRIBUTED BY CREATORS.COM
Originally published by Breaking Modern:
The Obama Administration has announced plans for armed drone sales to foreign countries. What could possibly go wrong?
Originally published by ANewDomain.net:
Turns out wealthy art collectors in the US and Europe are inadvertently funding the Islamic State of Iraq and Syria (ISIS). They’re reportedly snatching up looted artwork from ISIS-controlled museums in Syria and Iraq. And ISIS is using the illegal sale of such looted museum pieces to finance its expanding war.
Originally published by Breaking Modern:
It’s too bad, but Baby Boomers continue to belie generational stereotypes. In a recent survey, they overwhelmingly said they feel either too healthy or too financially insecure to retire at the traditional age of 65. Even to the bitter end, they continue to overshadow the Generation Xers and Millennials who need them to step aside gracefully and make room.
Workers hate, hate, hate open office spaces, which they say strip their dignity, distract them constantly and cramp their personal space. But corporations think workers should just tolerate open spaces anyway. Oh, the joy of open office spaces.
Desire, the Indian philosopher Jiddu Krishnamurti taught, causes suffering.
I managed to make it half a century, and thus likely more than halfway to death (which, Arthur Schopenhauer teaches us, is the goal of life), not only by failing to internalize the belief that optimism breeds disappointment, but by passionately refusing to believe it. Without desire, I fervently believed, there is no motivation and thus no accomplishment.
Without ambition, how does one succeed in one’s work or find the love of one’s life? I know people who don’t want anything. They’re called potheads.
But I’ve changed my mind. The stoners may be on to something.
Give up hope — and you might find happiness. I did!
As I’ve read and heard often occurs with spiritual journeys, I arrived at my epiphany as the result of an unexpected accident.
Like other cartoonists, I apply for the Pulitzer Prize, America’s most prestigious journalism award, every January.
I hate it. Yet I do it.
I hate it because it’s a lot of work, the odds are long, and the choice of the winner is usually — to be diplomatic — baffling. Out of the 20-ish times I’ve entered, spending a full day or two each year printing out and pasting up cartoons and clips into a binder (and in the computer age, formatting and uploading them), not to mention 20-ish $50 application fees, all I have to show for my efforts is one finalistship. Back in 1996.
To datestamp this story: the letter was typed. As in: on a typewriter.
Like Charlie Brown trying to kick Lucy’s football, I apply for the Big P under the old New York Lotto dictum that you have to be in it to win it. What if the year I don’t enter is the year that I would have won?
Contest Judge #1 to other Judges: So that’s all the entries in the cartooning category.
Judge #2: Wait a minute. Where’s Ted Rall?
Judge #1: He didn’t apply.
Judge #2: WTF?
Judge #3: I specifically came here to give Ted Rall his long-overdue award!
Judge #1: Me too. I doublechecked. Tragically for journalism, he did not enter.
Judge #4: Can we call him?
Judge #1: That would be against the solemn Rules. We must choose from the other entries.
Judges #2-#4 commit suicide in interesting ways.
The deadline used to be January 30th, so I thought it still was, but a few years ago they moved it to the 25th, God knows why. I blew the deadline.
As though carried off by a drone labeled “Short-Sighted Defense Policy,” a metaphorical weight bigger than a crosshatched albatross labeled “National Debt” lifted from my shoulders.
I didn’t enter. So I would not, could not, win.
Which meant I couldn’t be passed up in favor of someone else. To be precise, I couldn’t lose to someone I didn’t think was as good as me.
What a relief!
I really really really don’t mind losing to someone good. When someone good has won, I have been happy for the winner. I did not grit my teeth. I congratulated them, and meant it, and resolved to do better next year.
The problem is, the winner of the Pulitzer is usually very not good. Not as good as me. Not pretty good. Not even as good as average.
Losing to someone whose work I don’t respect hurts because it means either (a) the sucky winner is better than me, so therefore I suck even more, or (b) the Pulitzers are judged by dolts, so I must be an idiot to submit to the process, much less care about the results. I strongly suspect (b), though (a) could be true.
From late January, when I realized that I couldn’t enter, to early April, when they announced the results, I felt lighter on my feet. When my colleagues called to handicap the prize, my usual toxic mix of ambition, dread and fear of disappointment was replaced by the carelessness of knowing that I had no dog in the race and that whatever happened wouldn’t be a reflection upon me. So what if someone bad won? The judges never saw my stuff. So I wouldn’t have to spend weeks and months wondering how it was possible that anyone could look at the cartoons by the terrible winner next to mine and choose him instead of me.
I should confess that other cartoonists, no doubt smarter than me, arrived at this wisdom when they were younger. One, 10 years my junior, casually remarked that she gave herself a mini Pulitzer Prize every year by not entering: $50 a year adds up. Not to mention the time she saved compiling entries.
Last year’s winner turned out to be someone whose cartoons couldn’t possibly be more different from mine. Ditto for the finalists. Given who they chose, the judges weren’t interested in the genre of cartooning I do, so I would never have stood a chance.
Not entering was the right move. Or non-move.
This year, however, I remembered the deadline. To enter or not to enter? I entered.
Now I wish I hadn’t.
Originally published by ANewDomain.net:
Legacy news organizations are failing for a lot of reasons, mostly brought on by themselves, but there’s one that rarely if ever gets remarked upon: the fact that they have forgotten the definition of “news.”
As you and I know, news is stuff that happened that a significant number of people would like to know about. By definition, news is surprising.
All too often in recent decades, however, corporate media conglomerates have conflated news with press releases – in other words, informing us not about what we need or want to know, but about what they would like us to know.
A major driver of this trend is the misguided belief among press and broadcast organizations that the powers that be – politicians, government agencies and businesses – create news and thus must be coddled, their official pronouncements dutifully disseminated in the form of news, lest reporters be denied access, which would of course put an end to their ability to do their jobs.
One symptom of this too-close-for-comfort relationship between the fourth estate and those it is supposed to cover is the willingness of outlets like the New York Times to suppress or delay stories at the request of intelligence agencies due to so-called “national security concerns.”
The idea that reporters need access to PR flacks is nonsense. The opposite is true: publicists need journalists. A press conference is a news-free zone, a place where spin and propaganda rule. Unfortunately for them and for us — since the vast majority of reporting still originates in corporate-owned newspapers — the trend is accelerating.
Check out, for example, this excuse for a news story: “Obama condemns ‘brutal and outrageous murders.’”
According to Google, this story – about the president’s reaction to the murder of three Muslim students in Chapel Hill, North Carolina – was reproduced over 78,000 times in American and foreign media outlets.
There is nothing wrong with what Obama said. To the contrary: I agree with him 100%. Most likely, so do you. So do 100% of sane Americans, which means perhaps 90% of all Americans. His reaction was the exact reaction that you would expect from anyone and, to my point, specifically from him.
In other words, a news “story” about President Obama saying that mass murder is bad (outside the context of, say, wars of choice and drone assassinations) is no story at all. It is “dog bites man.” And not a particularly interesting dog or a particularly interesting man.
You really have to question the judgment of those thousands of editors and producers who put that story out yesterday. Who, exactly, did they think that story served? Certainly not the readers or viewers or listeners. Not one of them was surprised; not one of them cared.
Every newsroom receives hundreds if not thousands of emails a day from people who want their story or product or person covered. Publishers want their books reviewed. Manufacturers want a free plug for their products. Agents want their pet musician profiled. The vast majority of them are, of course, ignored. Pertinent story: a friend who works at a major American newspaper tells me about the fax machine that no one ever checks, which runs 24 hours a day, spitting endless press releases straight into the recycled-paper bin, totally pointless for all concerned. Yet I know for a fact that that same paper ran the story about Obama taking the bold risk of coming out against random mass murder. Why that story and not the others?
I’m not arguing that traditional media outlets ought to descend to Huffington Post’s SEO-optimized clickbait or BuzzFeed’s “18 ridiculously cute photos of insipid pets” listicles. But the Internet is certainly a lot better at knowing what people might actually want to read or see. Stories like the one above make that painfully obvious.
As an editor at the New York Times told me once, “the President of the United States controls the world’s largest armies and presides over the world’s largest economy. By definition, anything he says and does is news.”
They live by that attitude. They are also dying by it.