Dspl.ca Homepage

Welcome to our little spot on the Internet. Finally got it back to the basics. This site is built from just a few PHP files and a flat-file directory structure.

Things will get broken, and things will get better. Cheers.
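
For anyone curious how a page like this can stay so simple: here is a rough sketch of pulling in one of the feeds below with plain PHP and SimpleXML. It is only an illustration with made-up details (the feed URL and the five-item limit are placeholders), not the actual code behind this site.

<?php
// Rough sketch: print the latest items from an RSS feed using SimpleXML.
// The feed URL and the five-item limit are placeholders, not this site's
// real configuration.
$feedUrl = 'https://rss.slashdot.org/Slashdot/slashdotMain';
$feed    = @simplexml_load_file($feedUrl);   // suppress warnings, check result below

if ($feed === false) {
    echo "Feed unavailable right now.\n";
    exit;
}

$shown = 0;
foreach ($feed->channel->item as $item) {
    if (++$shown > 5) {
        break;                               // only show a handful of stories
    }
    $title = (string) $item->title;
    $stamp = strtotime((string) $item->pubDate);

    echo $title . "\n";
    // If pubDate is missing or unparseable, strtotime() returns false; handing
    // date() a 0 instead is one way a page ends up claiming everything was
    // "Posted on Thursday January 01, 1970".
    echo 'Posted on ' . date('l F d, Y', $stamp !== false ? $stamp : 0) . "\n\n";
}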

Old School News Feeds (RSS)

Slashdot

Pearson Ditches Print Textbooks For College Students in Digital-First Strategy
Posted on Thursday January 01, 1970

Textbook publishing giant Pearson will soon be publishing a lot fewer textbooks. It said this week it's ending regular revisions of all print textbooks in its higher-education category. As Pearson faces mounting pressure from the resale market, the move signals a growing shift in the publishing industry to a "digital-first" model. From a report: Instead of revising all 1,500 of its active titles every three years according to the print schedule, the British education publisher said it will focus on updating its digital products more frequently, offering artificial intelligence capabilities, data analytics and research. Pearson is billing the decision as a way to help drive down college costs for students. But the company and the education publishing industry as a whole have been criticized for years for the rising prices of textbooks. That has pushed a majority of students into secondhand textbook markets like Chegg or spurred them to forgo buying class materials altogether. The average cost of college textbooks rose about four times faster than the rate of inflation over the last decade. "Our digital first model lowers prices for students and, over time, increases our revenues," Pearson CEO John Fallon said in a statement. "By providing better value to students, they have less reason to turn to the secondary market." Pearson's e-books cost about $40 on average and go up to $79 with additional learning tools like homework assistance, according to Fallon. That compares to print textbooks, which can cost as much as $200 or $300, though students can still rent a print copy for about $60 on average.

Read more of this story at Slashdot.

Justice John Paul Stevens, Dead At 99, Promoted the Internet Revolution
Posted on Thursday January 01, 1970

Former Supreme Court Justice John Paul Stevens passed away Tuesday evening of complications following a stroke he suffered on July 15. He was 99 years old. An anonymous Slashdot reader shares a lightly edited version of Ars Technica's 2010 story that originally marked his retirement from the Supreme Court: In April 2010, the Supreme Court's most senior justice, John Paul Stevens, announced his retirement. In the weeks that followed, hundreds of articles were written about his career and his legacy. While most of those articles focused on 'hot button' issues such as flag burning, terrorism, and affirmative action, Stevens' tech policy record was largely ignored. When Justice Stevens joined the court, many of the technologies we now take for granted -- the PC, packet-switched networks, home video recording -- were in their infancy. During his 35-year tenure on the bench, Stevens penned decisions that laid the foundation for the tremendous innovations that followed in each of these areas. For example, he wrote the 1978 decision that shielded the software industry from the patent system in its formative years. In 1984, Hollywood's effort to ban the VCR failed by just one Supreme Court vote; Stevens wrote the majority opinion. And in 1997, he wrote the majority opinion striking down the worst provisions of the Communications Decency Act and ensuring that the Internet would have robust First Amendment protections. Indeed, Justice Stevens probably deserves more credit than any other justice for the innovations that occurred on his watch. And given how central those technologies have become to the American economy, Stevens' tech policy work may prove one of his most enduring legacies. In this feature, we review Justice Stevens' tech policy decisions and salute the justice who helped make possible DRM-free media devices, uncensored Internet connections, free software, and much more. As the report mentions, Stevens was the Supreme Court's cryptographer. "Stevens attended the University of Chicago, graduating in 1941. On December 6 -- the day before the Japanese attacked Pearl Harbor -- Stevens enrolled in the Navy's correspondence course on cryptography." "Stevens spent the war in a Navy bunker in Hawaii, doing traffic analysis in an effort to determine the location of Japanese ships," the report adds. "He was an English major, not a mathematician, but he proved to have a knack for cryptographic work."

Read more of this story at Slashdot.

Apollo 11 Had a Hidden Hero: Software
Posted on Thursday January 01, 1970

"Monday's Wall Street Journal includes a special Apollo 11 feature," writes Slashdot reader Outatime in honor of the 50th anniversary since Apollo 11's Saturn V launched from the Kennedy Space Center. "[O]f particular interest to many Slashdot nerds is the piece on the pioneering computer hardware and software that took three astronauts, and landed two, on the moon." Here's an excerpt from the report: The [MIT Instrumentation Laboratory or I-Lab] was housed in a former underwear factory overlooking the Charles River, now long since demolished. The Apollo engineers and programmers labored at scuffed metal desks in cubicles with code scribbled on the chalkboard, slide rules on the table, cigarette butts on the linoleum floor. Fanfold computer printouts were stacked up to 6 feet high, like termite mounds. The lab had pioneered inertial guidance systems for the nuclear-warhead-tipped missiles of the Cold War, such as the submarine-launched Polaris intercontinental ballistic missiles. Funded by the U.S. Air Force, it also developed a plan in the late 1950s to fly a computerized probe to Mars and back. MIT received the first major Apollo contract, the only one awarded to a university, and the only one given without competitive bidding. In an era when a computer used fragile tubes, ran on punch cards and filled an entire room, the I-Lab engineers had invented a briefcase-size digital brain packed with cutting-edge integrated circuits and memory so robust it could withstand a lightning bolt -- a direct ancestor of almost all computers today. Unlike other machines of its era, it could juggle many tasks at once and make choices of which to prioritize as events unfolded. Apollo missions carried two of these computers, one aboard the command module and one in the lunar lander, running almost identical software. Only the lunar lander, though, required the extra code to set down safely on the moon.

Read more of this story at Slashdot.

Elon Musk Unveils Neuralink's Plans For Brain-Reading 'Threads' and a Robot To Insert Them
Posted on Thursday January 01, 1970

Neuralink, the secretive company developing brain-machine interfaces, held a press conference today where it publicly unveiled, for the first time, some of the technology it has been developing. The first big advance is flexible "threads," which are less likely to damage the brain than the materials currently used in brain-machine interfaces and which create the possibility of transferring a higher volume of data. "The threads are 4 to 6 micrometers in width, which makes them considerably thinner than a human hair," reports The Verge. The other big advance that Neuralink unveiled is a machine that automatically embeds the threads into the brain. From the report: In the future, scientists from Neuralink hope to use a laser beam to get through the skull, rather than drilling holes, they said in interviews with The New York Times. Early experiments will be done with neuroscientists at Stanford University, according to that report. The company aims for human trials as soon as the second quarter of next year, according to The New York Times. The system presented today, if it's functional, may be a substantial advance over older technology. BrainGate, an earlier brain-computer interface system, relied on the Utah Array, a series of stiff needles that allows for up to 128 electrode channels. Not only is that fewer channels than Neuralink is promising -- meaning less data from the brain is being picked up -- it's also stiffer than Neuralink's threads. That's a problem for long-term functionality: the brain shifts in the skull but the needles of the array don't, leading to damage. The thin polymers Neuralink is using may solve that problem. However, Neuralink's technology is more difficult to implant than the Utah Array, precisely because it's so flexible. To combat that problem, the company has developed "a neurosurgical robot capable of inserting six threads (192 electrodes) per minute [automatically]," according to the white paper. In photos, it looks something like a cross between a microscope and a sewing machine. It also avoids blood vessels, which may lead to less of an inflammatory response in the brain, the paper says. Finally, the paper says that Neuralink has developed a custom chip that is better able to read, clean up, and amplify signals from the brain. Right now, it can only transmit data via a wired connection (it uses USB-C), but ultimately the goal is to create a system that can work wirelessly. Currently, the company is testing the robot and threads on rats, but it hopes to begin working with human test subjects as early as next year. Story is developing...

Read more of this story at Slashdot.

US Heat Waves To Skyrocket As Globe Warms, Study Suggests
Posted on Thursday January 01, 1970

An anonymous reader quotes a report from USA Today: As the globe warms in the years ahead, days with extreme heat are forecast to skyrocket across hundreds of U.S. cities, a new study suggests, perhaps even breaking the "heat index." By 2050, hundreds of U.S. cities could see an entire month each year with heat index temperatures above 100 degrees if nothing is done to rein in global warming. The heat index, also known as the apparent temperature, is what the temperature feels like to the human body when relative humidity is combined with the air temperature. This is the first study to take the heat index -- instead of just temperature -- into account when determining the impacts of global warming. The number of days per year when the heat index exceeds 100 degrees will more than double nationally, according to the study, which was published Tuesday in the journal Environmental Research Communications. On some days, conditions would be so extreme that they'd exceed the upper limit of the heat index, rendering it "incalculable," the study predicts. What is there to be done about this? "Rapidly reduce global warming emissions and help communities prepare for the extreme heat that is already inevitable," report co-author Astrid Caldas said. "Extreme heat is one of the climate change impacts most responsive to emissions reductions, making it possible to limit how extreme our hotter future becomes for today's children."
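
Since the whole study hangs on the heat index, here's the commonly used NWS Rothfusz regression for it, sketched in PHP. Treat it as an approximation: the official NWS algorithm adds small adjustment terms at very low and very high humidity that are left out here.

<?php
// Heat index ("apparent temperature") from air temperature in degrees F and
// relative humidity in percent, using the NWS Rothfusz regression. The
// official algorithm applies extra adjustments at very low / very high
// humidity that this sketch omits.
function heatIndex(float $t, float $rh): float
{
    return -42.379
        + 2.04901523  * $t
        + 10.14333127 * $rh
        - 0.22475541  * $t * $rh
        - 0.00683783  * $t * $t
        - 0.05481717  * $rh * $rh
        + 0.00122874  * $t * $t * $rh
        + 0.00085282  * $t * $rh * $rh
        - 0.00000199  * $t * $t * $rh * $rh;
}

// 96 F with 65% relative humidity already "feels like" roughly 121 F.
printf("Heat index: %.1f F\n", heatIndex(96.0, 65.0));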

Read more of this story at Slashdot.

Proudly powered by a Text Editor and some Internet Searches.


2017 dspl.ca end of file.