13 programming languages defining the future of coding
1. R
2. Java 8
3. Swift
4. Go
5. CoffeeScript
6. D
7. Less.js
8. MATLAB
9. Arduino
10. CUDA
11. Scala
12. Haskell
13. Jolt
Scientists turn memory chips into processors to speed up computing tasks
A team of international scientists has found a way to make memory chips perform computing tasks, which are traditionally handled by computer processors like those made by Intel and Qualcomm.
This means data
could now be processed in the same spot where it is stored, leading to much
faster and thinner mobile devices and computers.
This new computing
circuit was developed by Nanyang Technological University, Singapore (NTU
Singapore) in collaboration with Germany's RWTH Aachen University and
Forschungszentrum Juelich, one of the largest interdisciplinary research centers
in Europe.
It is built using state-of-the-art memory chips known as redox-based resistive switching random access memory (ReRAM). Developed by global chipmakers such as SanDisk and Panasonic, this type of chip is one of the fastest memory modules and is expected to be commercially available soon.
However, instead of merely storing information, NTU Assistant Professor Anupam Chattopadhyay, in collaboration with Professor Rainer Waser from RWTH Aachen University and Dr Vikas Rana from Forschungszentrum Juelich, showed how ReRAM can also be used to process data.
This discovery was
published recently in Scientific Reports.
Current devices and
computers have to transfer data from the memory storage to the processor unit for
computation, while the new NTU circuit saves time and energy by eliminating
these data transfers.
It can also boost the speed of current processors found in laptops and mobile devices by a factor of two or more.
By making the memory
chip perform computing tasks, space can be saved by eliminating the processor,
leading to thinner, smaller and lighter electronics. The discovery could also
lead to new design possibilities for consumer electronics and wearable
technology.
How the new circuit works
Currently, all computer processors on the market use the binary system, which is composed of two states: 0 or 1. For example, the letter A is processed and stored as 01000001, an 8-bit character.
However, the
prototype ReRAM circuit built by Asst Prof Chattopadhyay and his collaborators
processes data in more than just two states. For example, it can store and
process data as 0, 1, or 2, known as a ternary number system.
Because ReRAM stores information as different levels of electrical resistance, it could be possible to store data in an even higher number of states, speeding up computing tasks beyond current limitations.
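The gain from higher-radix storage is easy to see with plain base conversion (a generic illustration, not the device's actual encoding scheme): the letter A needs eight binary digits but only four ternary digits.

```python
def to_base(n, base):
    """Return the digits of a non-negative integer n written in the given base."""
    if n == 0:
        return "0"
    digits = []
    while n:
        digits.append(str(n % base))
        n //= base
    return "".join(reversed(digits))

a = ord("A")                   # code point 65
print(to_base(a, 2).zfill(8))  # 01000001 -- 8 binary digits (bits)
print(to_base(a, 3))           # 2102     -- only 4 ternary digits (trits)
```

In general, representing the same range of values takes roughly log(2)/log(3) ≈ 0.63 times as many ternary digits as binary ones, which is where the potential speedup of multi-state cells comes from.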
Asst Prof Chattopadhyay, who is from NTU's School of Computer Science and Engineering, said that in current computer systems, all information has to be translated into a string of zeros and ones before it can be processed.
"This is like
having a long conversation with someone through a tiny translator, which is a
time-consuming and effort-intensive process," he explained. "We are
now able to increase the capacity of the translator, so it can process data
more efficiently."
The quest for faster
processing is one of the most pressing needs for industries worldwide, as
computer software is getting increasingly complex while data centres have to
deal with more information than ever.
The researchers said
that using ReRAM for computing will be more cost-effective than other computing
technologies on the horizon, since ReRAMs will be available in the market soon.
ReRAM's excellent properties, such as its long-term storage capacity, low energy usage, and ability to be produced at the nanoscale, have drawn many semiconductor companies to invest in researching this promising technology.
The research team is
now looking to engage industry partners to leverage this important advance of
ReRAM-based ternary computing.
Moving forward, the researchers will work on developing ReRAM to process more than its current four states, which should further improve computing speeds, and will test its performance in actual computing scenarios.
Journal Reference:
Wonjoo Kim, Anupam Chattopadhyay, Anne Siemon, Eike Linn, Rainer Waser, Vikas Rana. Multistate Memristive Tantalum Oxide Devices for Ternary Arithmetic. Scientific Reports, 2016; 6: 36652. DOI: 10.1038/srep36652
Chip-sized, high-speed terahertz modulator raises possibility of faster data transmission
Tufts
University engineers have invented a chip-sized, high-speed modulator that
operates at terahertz (THz) frequencies and at room temperature at low voltages
without consuming DC power. The discovery could help fill the "THz
gap" that is limiting development of new and more powerful wireless
devices that could transmit data at significantly higher speeds than currently
possible.
Measurements show the modulation cutoff frequency of the new device exceeded 14 gigahertz, and it has the potential to work above 1 THz, according to a paper published online in Scientific Reports. By contrast, cellular networks occupy bands that are much lower on the spectrum, where the amount of data that can be transmitted is limited.
The
device works through the interaction of confined THz waves in a novel slot
waveguide with tunable, two-dimensional electron gas. The prototype device
operated within the frequency band of 0.22-0.325 THz, which was chosen because
it corresponded to available experimental facilities. The researchers say the
device would work within other bands as well.
Although
there is significant interest in using the THz band of the electromagnetic
spectrum, which would enable the wireless transmission of data at speeds
significantly faster than conventional technology, the band has been
underutilized in part because of a lack of compact, on-chip components, such as
modulators, transmitters, and receivers.
"This
is a very promising device that can operate at terahertz frequencies, is
miniaturized using mainstream semiconductor foundry, and is in the same form
factor as current communication devices. It's only one building block, but it
could help to start filling the THz gap," said Sameer Sonkusale, Ph.D., of
Nano Lab, Department of Electrical and Computer Engineering, Tufts University,
and the paper's corresponding author.
Journal Reference:
P. K. Singh, S. Sonkusale. High Speed Terahertz Modulator on the Chip Based on Tunable Terahertz Slot Waveguide. Scientific Reports, 2017; 7: 40933. DOI: 10.1038/srep40933
Patients' electrocardiogram readings could be used as an encryption key to access their medical records
Researchers at Binghamton University, State University of New York, think your heart could be the key to your personal data. By measuring the electrical activity of the heart, researchers say they can encrypt patients' health records.
The fundamental idea
is this: In the future, all patients will be outfitted with a wearable device,
which will continuously collect physiological data and transmit it to the
patients' doctors. Because electrocardiogram (ECG) signals are already
collected for clinical diagnosis, the system would simply reuse the data during
transmission, thus reducing the cost and computational power needed to create
an encryption key from scratch.
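The paper's exact scheme is not detailed here, but the general idea can be sketched as follows: quantize the ECG samples coarsely, so that small beat-to-beat noise maps to the same values, then hash the result into a symmetric key. This is a toy illustration under those assumptions, not the authors' algorithm; a real scheme would use robust ECG features and error-correcting codes to tolerate signal variation.

```python
import hashlib

def ecg_to_key(samples, step=8):
    """Toy sketch: derive a 128-bit key from integer ECG ADC readings.

    Coarse quantization (integer division by `step`) gives some tolerance
    to small measurement noise; hashing spreads the quantized samples
    into a fixed-size, AES-sized key.
    """
    quantized = bytes((s // step) % 256 for s in samples)
    return hashlib.sha256(quantized).digest()[:16]

# Two readings of the same heartbeat, differing only by small noise,
# quantize to the same values and therefore yield the same key.
reading1 = [130, 258, 514, 260, 131, 66]
reading2 = [133, 259, 517, 262, 129, 65]
key1 = ecg_to_key(reading1)
key2 = ecg_to_key(reading2)
print(key1 == key2)  # True
```

As the article notes below, this tolerance is also the scheme's weak point: larger variations from exertion, mental state, or aging would change the quantized values and break key agreement.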
“There have been so
many mature encryption techniques available, but the problem is that those
encryption techniques rely on some complicated arithmetic calculations and
random key generations," said Zhanpeng Jin, a co-author of the paper "A
Robust and Reusable ECG-based Authentication and Data Encryption Scheme for
eHealth Systems."
Those encryption
techniques can't be "directly applied on the energy-hungry mobile and wearable
devices," Jin added. "If you apply those kinds of encryptions on top
of the mobile device, then you can burn the battery very quickly."
But there are
drawbacks. According to Jin, one of the reasons ECG encryption has not been widely
adopted is because it's generally more sensitive and vulnerable to variations
than some other biometric measures. For instance, your electrical activity
could change depending on factors such as physical exertion and mental state.
Other more permanent factors such as age and health can also have an effect.
“ECG
itself cannot be used for a biometric authentication purpose alone, but it’s a
very effective way as a secondary authentication,” Jin said.
While
the technology for ECG encryption is already here, its adoption will depend on patients' willingness
to don wearables and on their comfort with constantly sharing their biometrics.
Apple, Google, and Uber join list of tech companies refusing to build Muslim registry
Apple, Google, and Uber have all broken their respective silences on whether they would help build a Muslim registry for the incoming Trump administration. An Apple spokesperson said, “We think people should be treated the same no matter how they worship, what they look like, who they love. We haven’t been asked and we would oppose such an effort.”
Earlier
today, a Google spokesperson issued a statement saying, “In relation to the
hypothetical of whether we would ever help build a ‘Muslim registry’ — we
haven’t been asked, of course we wouldn’t do this and we are glad — from all
that we’ve read — that the proposal doesn’t seem to be on the table.”
Meanwhile, Uber responded with a terse “no” in response to a similar inquiry.
These are just the latest, and arguably among the most important and high-profile, Silicon Valley players to go on record refusing to build a database that could be used to track and target Muslim Americans. Pressure started mounting last
month when The Intercept began asking tech companies about
the subject and only received a response from Twitter, which said it would
never participate in such a project.
The situation then heightened this week when a Facebook spokesperson, who had initially refused to comment on the matter, accidentally emailed a reporter. The email compared any statement regarding the building of a Muslim registry to a “straw man” argument and suggested Facebook’s PR strategy should be to remain silent. BuzzFeed published the email, which then forced Facebook to issue a statement saying it had not been asked, nor would it agree, to help build a Muslim registry.
Since
Facebook’s embarrassing stumble, a number of other tech companies have gone on
the record disavowing the highly controversial Trump campaign promise.
Microsoft PR head Frank X. Shaw said in a statement given to BuzzFeed,
“We oppose discrimination and we wouldn’t do any work to build a registry of
Muslim Americans.” Both Microsoft CEO Satya Nadella and Alphabet chief Larry
Page attended a summit with President-elect Donald Trump on Wednesday, as
did Apple CEO Tim Cook and Uber CEO Travis Kalanick.
Ride-hailing company Lyft, which like Uber could hypothetically be asked to hand over user travel data, said today it would refuse to cooperate with the government if asked for such data or other tools to build a Muslim registry. One
notable exception here has been Oracle, the cloud computing giant that has in
the past counted the National Security Agency as a client. The company declined
to comment when asked about a Muslim registry or whether it still works with
the NSA. In a separate event, Trump yesterday appointed Oracle CEO Safra Catz
to the executive committee of his transition team.
Google Launches Cloud Bigtable, A Highly Scalable And Performant NoSQL Database
With Cloud Bigtable, Google is launching a new NoSQL database offering today that, as the name implies, is powered by the company’s Bigtable data storage system, but with the added twist that it’s compatible with the Apache HBase API — which itself is based on Google’s Bigtable project. Bigtable powers the likes of Gmail, Google Search and Google Analytics, so this is definitely a battle-tested service.
Google promises that Cloud Bigtable will offer single-digit millisecond latency and 2x the performance per dollar when compared to the likes of HBase and Cassandra. Because it supports the HBase API, Cloud Bigtable can be integrated with all the existing applications in the Hadoop ecosystem, but it also supports Google’s Cloud Dataflow.
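The HBase-style data model underlying this compatibility is a sparse, wide-column map: row key → (column family, qualifier) → timestamped cell versions. A minimal in-memory sketch of that model (illustrative only; the class and names here are invented, and the real services are distributed systems):

```python
class WideColumnTable:
    """Toy in-memory sketch of the Bigtable/HBase wide-column data model."""

    def __init__(self):
        # row key -> {(family, qualifier): [(timestamp, value), ...]}
        self.rows = {}

    def put(self, row_key, family, qualifier, value, ts):
        cells = self.rows.setdefault(row_key, {}).setdefault((family, qualifier), [])
        cells.append((ts, value))
        cells.sort(key=lambda c: c[0], reverse=True)  # newest version first

    def get(self, row_key, family, qualifier):
        """Return the most recent cell value for the column, or None."""
        cells = self.rows.get(row_key, {}).get((family, qualifier))
        return cells[0][1] if cells else None

table = WideColumnTable()
table.put("user#42", "profile", "name", "Ada", ts=1)
table.put("user#42", "profile", "name", "Ada L.", ts=2)  # newer version wins
print(table.get("user#42", "profile", "name"))           # Ada L.
```

In the real systems, rows are additionally kept sorted by row key so that range scans are efficient; this sketch omits that detail and all distribution concerns.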
Setting up a Cloud Bigtable cluster should only take a few seconds, and the storage automatically scales according to the user’s needs.
It’s worth noting that this is not Google’s first cloud-based NoSQL database product. With Cloud Datastore, Google already offers a high-availability NoSQL datastore for developers on its App Engine platform. That service, too, is based on Bigtable. Cory O’Connor, a Google Cloud Platform product manager, tells me Cloud Datastore focuses on read-heavy workload for web apps and mobile apps.
“Cloud Bigtable is much the opposite — is designed for larger companies and enterprises where extensive data processing is required, and where workloads are more complex,” O’Connor tells me. “For example, if an organization needs to stream data into, run analytics on and serve data out of a single database at scale – Cloud Bigtable is the right system. Many of our customers will start out on Cloud Datastore to build prototypes and get moving quickly, and then evolve towards services like Cloud Bigtable as they grow and their data processing needs become more complex.”
The new service is now available in beta, which means it’s open to all developers but doesn’t offer an SLA or technical support.
Big Data & Hadoop Career Analysis
Market
research and advisory firm Ovum estimates the big data market will grow from
$1.7 billion in 2016 to $9.4 billion by 2020. As the market grows, enterprise
challenges are shifting, skills requirements are changing, and the vendor
landscape is morphing. The coming year promises to be a busy one for big data
pros. Here are some key predictions for big data in 2017 from industry watchers
and technology players.
· The era of ubiquitous machine learning has arrived.
· When data can’t move, bring the cloud to the data.
· Applications, not just analytics, propel big data adoption.
· The Internet of Things will integrate with enterprise applications.
· Data virtualization will light up dark data.
· A boom in prepackaged integrated cloud data systems.
· Cloud-based object stores become a viable alternative to Hadoop HDFS.
· Next-generation compute architectures enable deep learning at cloud scale.
· Hadoop security is no longer optional.
· Big data becomes fast and approachable.
· Organizations leverage data lakes from the get-go to drive value.
· The convergence of IoT, cloud, and big data creates new opportunities for self-service analytics.
· Self-service data prep becomes mainstream as end users begin to shape big data.
· Self-service analytics extends to data prep.
· Analytics will be everywhere, thanks to embedded BI.
· IT becomes the data hero.
· Artificial intelligence is back in vogue.
· Companies focus on business-driven applications to keep data lakes from becoming swamps.
· Data agility separates winners and losers.
· Blockchain transforms select financial service applications.