Chalmers University of Technology
Researchers at Chalmers University of Technology in Sweden say they have moved one step closer to a possible paradigm shift for the electronics industry. The researchers want to use graphene and terahertz waves in electronics to improve future data traffic. Graphene enables electrons to move much faster than in most conventional semiconductors, giving developers access to frequencies 100 to 1,000 times higher than gigahertz, which constitutes the terahertz range. "Data communication then has the potential of becoming up to 10 times faster and can transmit much larger amounts of data than is currently possible," says Chalmers' Andrei Vorobiev. The researchers have shown that graphene-based transistor devices can receive and convert terahertz waves. The team is now working to replace the silicon base on which the graphene is mounted, which limits graphene's performance, with other two-dimensional materials that can offset these limitations and enhance the effect.
CNN Money, May 2
Data visualization is emerging as an increasingly popular area for hiring managers, especially as companies search for new ways to make sense of vast, amorphous amounts of data and to present the numbers in ways that are clear, colorful, and interactive. As companies come to rely more and more on data-driven decisions, numeracy will be as crucial as literacy for anyone who wants to stay employable. Although the field of quantitative analytics is still in its infancy, the real stars at many companies are already "hybrid" employees, who can use data to make better decisions in fields like marketing and who can translate the figures into graphics other people can get excited about.
For anyone serious about a career in data visualization, it's important to start learning a new tool; Excel is not the only means of representing data. Plenty of powerful data visualization software is available for free, including Google Docs and Datawrapper. Experiment with a couple of these until you're comfortable with how they work and what they can do. Dust off that statistics book, too: with data rapidly becoming the lingua franca of business, a basic understanding of probability and statistics certainly won't hurt. Also, play around with numbers, and practice looking at huge sets of figures and extracting meaning from them that you can then represent visually. An amazing variety of data is published online by government agencies and others.
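To practice extracting meaning from raw figures before visualizing them, even the standard library is enough. A minimal sketch in Python (the numbers here are made up for illustration, not from any published data set):

```python
import statistics

# Hypothetical monthly page-view counts -- the kind of raw figures you
# might pull from a public data set before deciding how to chart them.
views = [1200, 1350, 980, 1500, 2100, 1750, 1600, 990, 1420, 1880, 2300, 1100]

# Summarize before you visualize: these three numbers already suggest
# what a chart should emphasize (central tendency vs. spread).
mean = statistics.mean(views)
median = statistics.median(views)
spread = statistics.stdev(views)

print(f"mean={mean:.0f} median={median:.0f} stdev={spread:.0f}")
```

A summary like this is a sanity check: if the mean and median diverge sharply, a simple bar chart of averages may hide the story, and a distribution plot may serve the audience better.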
Design is a popular topic these days, especially in data visualization, where it's essential to making information accessible. Being able to speak intelligently about design, and showing that you know how to use it, puts you at an advantage over pure quantitative analysts. Once you've become adept at presenting figures in visual terms, you'll find that not everyone in an organization wants to accept what the data are saying, especially if they don't like numbers or resist basing decisions on them. So you have to know how to tell a story, and how to sell what your data visualization is showing. That means thinking about your audience when choosing how much detail to share.
Campus Technology (05/01/14)
Code.org and the New York City Foundation for Computer Science Education (CSNYC) plan to use the Bootstrap curriculum to help educators learn how to teach students algebraic and geometric concepts with computer programming. The two nonprofits will use middle-school lessons within schools and districts where they have a presence. Code.org and CSNYC promote adding computer science classes to schools starting in early grades. The curriculum is free and aligns with Common Core math standards. Launched as a 10-week after-school program, Bootstrap is now transitioning to become an in-school program in which students learn a programming language and other concepts and create a game. "The whole curriculum is a sequence of steps that get you to the point where you have a working game at the end," says Brown University professor and Bootstrap co-developer Shriram Krishnamurthi. "Once we tell [students] they're going to make their own game, the motivation is done."
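Bootstrap's central premise is that an algebraic function and a programming function are the same thing. The curriculum itself is built on a functional teaching language, but the idea can be sketched in Python (the function and numbers below are illustrative, not taken from the actual curriculum):

```python
# An algebra-class function like y = 4x + 10 is also a program: here it
# drives a game character whose x-position is a linear function of time.

def character_x(ticks):
    """Position after `ticks` game ticks: speed 4 pixels/tick, starting at x=10."""
    return 4 * ticks + 10

print(character_x(0))   # starting position: 10
print(character_x(25))  # position after 25 ticks: 110
```

Students who can evaluate y = 4x + 10 by hand can predict what the game will do, which is the connection between the math standards and the working game the curriculum builds toward.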
Stephen Hawking: 'Transcendence Looks at the Implications of Artificial Intelligence--But Are We Taking AI Seriously Enough?'
The Independent (United Kingdom) (05/01/14)
Today's advances in artificial intelligence (AI) research will pale in comparison to what the next decade will bring, write Stephen Hawking, Stuart Russell, Max Tegmark, and Frank Wilczek. They say success in advancing AI would be the biggest event in human history, as AI could provide tools for eradicating war, disease, and poverty. Looking further ahead, there are no fundamental limits to what can be achieved, and an explosive transition is possible, although it might play out differently from what is depicted in popular entertainment. The authors warn AI's development could lead to machines with superhuman intelligence outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons people cannot understand. The short-term impact of AI depends on who controls it, but the long-term impact depends on whether it can be controlled at all. Facing potential futures of incalculable benefits and risks, the authors say experts are not doing everything possible to ensure the best outcome, and they note little serious research is devoted to these issues outside certain nonprofit institutes. They say everyone in the field should ask themselves what they can do to improve the chances of reaping the benefits and avoiding the risks.
HPC Wire (04/17/14) Tiffany Trader
University of Wisconsin professor Mark Hill is working to make computers more efficient by finding hidden inefficiencies in their architecture, an increasingly necessary focus for computer engineers as Moore's law approaches its practical limits. The performance of computer tasks is one area of concentration: Hill times such tasks to determine overall speed and the duration of each individual step. He successfully applied paging selectively, using a simpler address translation method for certain components of important applications. This reduced cache misses to less than 1 percent, a solution that would let users do more with the same hardware, shrinking their server requirements and saving money. "A small change to the operating system and hardware can bring big benefits," Hill observes. He advocates a more unified approach, and he believes the slowdown in Moore's law can be countered by finding and eliminating enough hidden inefficiencies. "I think we're going to wring out a lot of inefficiencies and still get gains," Hill says. "They're not going to be like the large ones that you've seen before, but I hope that they're sufficient that we can still enable new creations, which is really what this is about."
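A back-of-envelope sketch shows why cutting translation misses matters so much. The latencies and miss rates below are illustrative assumptions, not figures from Hill's work:

```python
# Average memory access time as a function of the per-access miss rate:
# every miss pays a large extra penalty (e.g., a page-table walk), so even
# a modest miss rate dominates the average.

def avg_access_ns(hit_ns, miss_penalty_ns, miss_rate):
    """Average access time: the base hit cost plus the expected miss penalty."""
    return hit_ns + miss_rate * miss_penalty_ns

# Assumed numbers: 1 ns on a hit, 100 ns penalty on a translation miss.
baseline = avg_access_ns(hit_ns=1.0, miss_penalty_ns=100.0, miss_rate=0.10)
improved = avg_access_ns(hit_ns=1.0, miss_penalty_ns=100.0, miss_rate=0.01)

print(f"baseline={baseline:.0f} ns, improved={improved:.0f} ns")
```

Under these assumptions, dropping the miss rate from 10 percent to under 1 percent cuts the average access time severalfold, which is the kind of "hidden inefficiency" gain the article describes.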
Mashable, April 12
For those looking to land a job at a hot tech startup or a coveted spot with an established technology company, the job search scene has an up-and-coming competitor to the traditional career fair: hackathons. With that in mind, Readyforce, a career network for college students, aims to home in on hackathons as an outlet for job seekers. The platform streamlines the process of connecting students with companies and organizations recruiting those with computer science and computer engineering backgrounds. Readyforce's new platform, HackerHub, launched this spring as a one-stop shop for student leaders and companies with post-grad opportunities.
Hackathons are ideal hunting grounds for companies looking to land top talent straight out of school. They are often more appealing than a traditional career fair, as companies can send their engineers to an event and get a first-hand glimpse at potential candidates and their skills. Students in the upper echelons of computer science or engineering fields are often highly sought after. These are the doers: students who work outside the classroom and take the initiative to start their own projects. They are the kind of hires who, in some cases, are as experienced as engineers who have been working for years. Companies seeking such talent may want to consider hackathons as a way of making an impression on these candidates and determining which students are the best fit for their companies.
Computerworld, January 10
Expertise in mobile, Web development and big data will be much sought-after by IT hiring managers in 2014. Software development clients are especially interested in Web developers who have Java skills or Ruby and Python development backgrounds. In the mobile space, experience with developing for Apple's iOS leads the way, followed by demand for Google Android and Windows Phone. With data analysis driving corporate spending, up-and-coming IT roles include data engineers and data scientists.
Tech hiring in 2014 will also include filling more traditional roles beyond just the most in-demand jobs. For example, companies are looking for people with software quality assurance backgrounds as they attempt to add those positions after cutting them during the recession. IT security followed the same pattern and employers are now encountering a lack of security talent. Skills involving network administration, Windows administration and desktop support will be in demand in 2014. In addition, the stronger economy will mean an increase in project manager and business analyst hiring as companies add or expand projects to handle the work uptick.
However, this need for tech talent also contributes to the industry's hiring struggles. IT's low unemployment rate, generally agreed to be between 3 and 3.5 percent, indicates a market close to full employment, since hiring professionals consider 2 percent unemployment effectively full employment. By comparison, the overall U.S. unemployment rate for December was 6.7 percent. The demand for tech workers may grow stronger in 2014: salaries are rising, reflecting the supply and demand in the marketplace.
InfoWorld (02/03/14) Peter Wayner
The New York Times (01/26/14) Nick Bilton
Holodecks similar to the simulated-reality rooms seen on "Star Trek" could be available by 2024, according to some scientists and researchers. Computer companies, Hollywood, and video-game makers want to move entertainment closer to reality by enabling users to see things, move around their living rooms, and become part of the story. The technology could enable gamers to step inside a computer-simulated Yankee Stadium, for example, pick up a computer-simulated bat, and hear the roar of a computer-simulated crowd. Advanced Micro Devices has built a version of a holodeck that is shaped like a dome, covered with wall-to-wall projectors, and uses surround sound, augmented reality, and other technologies to recreate the real world. The U.S. Army Research Laboratory has created an omnidirectional treadmill, a floor that lets users seemingly wander while staying in place. Meanwhile, Microsoft has built the IllumiRoom and Lightspace, while the University of Illinois at Chicago has created CAVE2. Gaming appears to be the driving technology that could disrupt the TV market and business travel, and cause users to prefer life in a virtual world. "Our desire for more realistically spattered blood seems to be our saving grace in terms of keeping Moore's Law going," says futurist Brad Templeton.
Computerworld (01/16/14) Sharon Gaudin
Google researchers are developing a smart contact lens that uses tiny chips, sensors, and antennas to continuously test diabetics' blood sugar levels. The technology uses wireless chips and miniaturized glucose sensors to measure glucose levels in the user's tears. "At GoogleX, we wondered if miniaturized electronics--think chips and sensors so small they look like bits of glitter, and an antenna thinner than a human hair--might be a way to crack the mystery of tear glucose and measure it with greater accuracy," according to the project's founders. "We're testing prototypes that can generate a reading once per second." The researchers also are studying the potential of the lenses to serve as an early warning for wearers when glucose levels get too low. "This type of 'in-eye' technology is the precursor to having Google Glass directly in our eyes," says analyst Patrick Moorhead. "To many, this is fascinating and inspiring. To others it is creepy and scary." He notes the project could offer insights into the future development of Google Glass. "If you project this forward a few years and add a flexible display, a display controller, and a radio that can talk to your smart watch, then you have Google Glass of the future."
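The early-warning idea amounts to a threshold check over a once-per-second stream of readings. A minimal sketch in Python, with a hypothetical threshold and made-up values, not figures from Google's project:

```python
# Flag the moments in a once-per-second stream of tear-glucose readings
# (mg/dL) that fall below a low-glucose threshold.

LOW_THRESHOLD_MG_DL = 70  # illustrative hypoglycemia cutoff, not Google's

def low_glucose_alerts(readings):
    """Return the indices (seconds) at which a reading falls below the threshold."""
    return [i for i, mg_dl in enumerate(readings) if mg_dl < LOW_THRESHOLD_MG_DL]

stream = [95, 88, 80, 74, 69, 66, 72, 81]  # one reading per second
print(low_glucose_alerts(stream))  # prints [4, 5]
```

A real device would need smoothing and debouncing so a single noisy sample does not trigger an alert, but the core logic is this simple comparison repeated once per second.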