Wednesday, April 29, 2015

Bridge to the past: OpenLegacy's plan to bring mainframes into the 21st century

Summary: Tel Aviv startup OpenLegacy is bringing the world of ancient mainframes together with modern web-based computing environments.
By Niv Lilien for Tel Aviv Tech

Across many industries, legacy hardware - including mainframes - is still common. Whether it's financial services, insurance, aviation, or car rental businesses, a lot of 'old economy' companies are using relics from an earlier era of computing. AS/400s, Unix-based systems, and similar hardware are still up and running in businesses across the world. They often underpin mission-critical functions and execute millions of transactions in seconds - and are all broadly grouped together under the term 'legacy computing'.

Getting those computational dinosaurs to work with modern environments like cloud and mobile can be a serious headache for the IT department.

OpenLegacy, a Tel Aviv-based startup, is targeting this market with an open source product intended to bridge the gap between legacy systems and modern web-based environments, as well as ERP, CRM, and even Internet of Things systems. The company was formed by a team with a background in integrating legacy and modern systems. Romi Stein, OpenLegacy's CEO and co-founder, has 15 years under his belt at IBM, while Roi Mor, OpenLegacy's CTO and co-founder, has spent a similar amount of time spearheading complex modernization projects in various environments.

OpenLegacy's product is a Java-based API, which creates an ORM layer that maps the legacy resources and assigns a Java object to each one, be it an application, a screen, or a database. Interfacing with legacy systems is done via RPC, SQL, or Telnet, using OpenLegacy's own connectors, with new ones added as needed.
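
To make the mapping concrete, here is a minimal conceptual sketch in Java of what "a Java object per legacy resource" can look like. The annotation and class names below are illustrative assumptions for this article, not OpenLegacy's actual API.

    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;

    // Hypothetical annotations standing in for a screen-to-object mapping layer.
    @Retention(RetentionPolicy.RUNTIME)
    @interface LegacyScreen { String name(); }

    @Retention(RetentionPolicy.RUNTIME)
    @interface LegacyField { int row(); int column(); int length(); }

    // A green-screen customer inquiry, described declaratively and exposed as a plain Java object.
    @LegacyScreen(name = "CUSTINQ")               // assumed screen name
    class CustomerInquiry {
        @LegacyField(row = 3, column = 10, length = 8)
        String customerId;

        @LegacyField(row = 5, column = 10, length = 30)
        String customerName;

        @LegacyField(row = 7, column = 10, length = 12)
        String accountBalance;
    }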

To that structure, OpenLegacy adds object-oriented target accelerators for web, mobile, or another server or API. As a byproduct, OpenLegacy's "magic box" also creates an elementary UI and front end, allowing for a full web experience. Being an open source company, OpenLegacy offers its basic product for free. "It's for noob developers as well," says Stein. "It's almost drag and drop."
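
As an illustration of the generated front end - reusing the CustomerInquiry sketch above - such an "accelerator" can be pictured as a small HTTP endpoint that serves the mapped object as JSON. This sketch uses the JDK's built-in com.sun.net.httpserver package; the LegacyScreenSession interface is a hypothetical stand-in for whichever RPC, SQL, or Telnet connector actually fetches the screen.

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    interface LegacyScreenSession {                   // hypothetical connector facade
        CustomerInquiry fetch(String customerId);
    }

    public class CustomerApi {
        public static void main(String[] args) throws Exception {
            // Stubbed connector so the sketch runs standalone; a real one would talk to the backend.
            LegacyScreenSession session = id -> {
                CustomerInquiry c = new CustomerInquiry();
                c.customerId = id;
                c.customerName = "DEMO CUSTOMER";
                c.accountBalance = "0.00";
                return c;
            };

            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/customers", exchange -> {
                String id = exchange.getRequestURI().getQuery();  // e.g. ?A1234567 (no validation in this sketch)
                CustomerInquiry c = session.fetch(id == null ? "UNKNOWN" : id);
                byte[] body = String.format(
                        "{\"id\":\"%s\",\"name\":\"%s\",\"balance\":\"%s\"}",
                        c.customerId, c.customerName, c.accountBalance)
                        .getBytes(StandardCharsets.UTF_8);
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream os = exchange.getResponseBody()) {
                    os.write(body);
                }
            });
            server.start();
        }
    }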

OpenLegacy also provides a high level of analysis automation. "We can take source code, a line of recorded screens or a whole interface - which makes a lot of an integrator's work obsolete," said Mor.

OpenLegacy also offers automatic testing capabilities. "A lot of these legacy modernization projects failed because you changed something in the backend and forgot to change it in the frontend - so you start getting crashes and exceptions. We offer automatic testing, with no need for manual testing - if something is wrong, you get a flag," says Stein.
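
A minimal sketch of that flagging idea, in plain Java: compare the fields the frontend object layer expects against the fields the backend screen still exposes, and fail loudly when they drift apart. The field names and the hard-coded "backend" set are stand-ins for whatever a recorded or live screen scan would return; this is not OpenLegacy's actual test harness.

    import java.util.List;
    import java.util.Set;

    public class ScreenContractCheck {
        public static void main(String[] args) {
            // Fields the frontend/object layer expects (from the mapped Java object).
            List<String> expected = List.of("customerId", "customerName", "accountBalance");

            // Fields currently found on the backend screen (stubbed here; imagine a renamed balance field).
            Set<String> backend = Set.of("customerId", "customerName");

            boolean ok = true;
            for (String field : expected) {
                if (!backend.contains(field)) {
                    System.out.println("FLAG: backend no longer exposes field '" + field + "'");
                    ok = false;
                }
            }
            if (!ok) {
                System.exit(1);   // fail the build instead of crashing in production
            }
            System.out.println("Screen contract intact.");
        }
    }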

Stein takes pride in the fact that OpenLegacy isn't your regular startup. Despite being only a 20-employee company, OpenLegacy already has a variety of projects running, with the Israeli Airport Authority, two local insurance companies, Migdal and Shomra, and Delek Motors - the Israeli importer of BMW, Ford, and Mazda vehicles.

According to Stein, for one customer, the company managed to create an interface that cut down the response time from three seconds per inquiry to 300 milliseconds in less than a week, following a previous attempt that took the customer six months.

In a similar manner, OpenLegacy built a basic web interface for a credit card company's mainframe systems, displaying consumer transactions for the corporate website in 48 hours. It also took Delek Motors' 120 AS/400 screens and business processes to full integration with Microsoft's Dynamics CRM in three months.

Beyond gaining momentum in the SMB sector thanks to its relatively low cost and rapid deployment time, OpenLegacy plans to expand its chain of integrators and add more layers of integration to legacy computing.

OpenLegacy's ambition is to be the middleware gate through which all legacy interaction will be handled. "Legacy computing isn't equipped to handle directly the enormous stress coming from web services," Ze'ev Avidan, OpenLegacy's VP of product management, said.

OpenLegacy is aiming to tackle three of those stresses. The first is security: vetting queries with an understanding of the specific services they target. The second is workload: forwarding only the relevant queries that the legacy computer can handle - for example, throttling location services that poll ten times a minute.

The third is management-related: as situations get more complex, when you discover an exploit or a problematic piece of data, having everything concentrated in one place lets you fix it on the spot.
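
The workload point in particular can be pictured as a thin gate in front of the legacy backend. The sketch below, with assumed names and an assumed one-request-per-interval policy, forwards a client's query only if enough time has passed since its last one - so a location service polling ten times a minute reaches the mainframe far less often.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    public class LegacyGate {
        private final long minIntervalMillis;
        private final Map<String, Long> lastForwarded = new ConcurrentHashMap<>();

        public LegacyGate(long minIntervalMillis) {
            this.minIntervalMillis = minIntervalMillis;
        }

        // Returns true if the query should be forwarded to the legacy backend.
        public boolean allow(String clientId) {
            long now = System.currentTimeMillis();
            Long last = lastForwarded.get(clientId);
            if (last != null && now - last < minIntervalMillis) {
                return false;                          // drop, or serve from a cache instead
            }
            lastForwarded.put(clientId, now);
            return true;
        }

        public static void main(String[] args) {
            LegacyGate gate = new LegacyGate(60_000);  // at most one forwarded query per minute per client
            System.out.println(gate.allow("location-service"));   // true: forwarded
            System.out.println(gate.allow("location-service"));   // false: absorbed by the gate
        }
    }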

OpenLegacy's Mor sums up the company's current state with a reference to his time at IBM. "In days past, the AS/400 guys used to see companies like us as a threat. Today they take us everywhere they go."

Monday, April 27, 2015

Brazil to Build World’s Largest Floating Solar Farm Amidst Devastating Drought

With Brazil’s historic drought drying up its hydroelectric plants, the South American country is turning to solar power to help relieve its looming energy crisis.
By Lorraine Chow, EcoWatch

Brazil’s devastating drought has depleted its reservoirs, prompting the nation to consider alternative energy options besides hydropower, which supplies more than 75 percent of the country’s power.
The nation announced that within four months, it will commence pilot tests of a gigantic floating solar farm located atop the Balbina hydroelectric plant in the Amazon. It’s currently unclear how physically large the floating farm will be, but the enormous reservoir it will sit on covers 2,360 square kilometers.

At 350 megawatts, Brazil’s ambitious project would easily trump Japan’s 13.4-megawatt plant, currently the world’s largest floating solar power plant, in terms of power output. To put that in further perspective, the largest solar farm in the world is the 550-megawatt Desert Sunlight Solar Farm in California.

Diversifying energy sources is clearly a necessity for the notoriously parched country. Brazil is experiencing its worst drought in four decades, causing electricity blackouts in many regions. Below-average rainfall in the last few years has depleted its reservoirs, thus gutting its formerly plentiful supply of hydropower, which supplies more than three-quarters of the country’s electricity, according to the U.S. Energy Information Administration.
As Climate News Network reported, “the reservoirs in the drought-affected region could fall to as little as 10 percent of their capacity, which … Mines and Energy Minister Eduardo Braga admits would be ‘catastrophic’ for energy security.”

While the sunny country has tremendous potential for solar power, Brazil has been slow to embrace this form of renewable energy. It was only in October 2014 that Brazil made its first foray into the sector with the construction of 31 solar parks, its first large-scale solar project, totaling a combined capacity of 1,048 megawatts.

A shift to solar energy might be fitting, as the Balbina Dam (where the proposed solar farm will eventually sit) has been criticized for emitting more greenhouse gases than a coal-fired power plant.
“We are adding technological innovation, more transmission lines, diversifying our energy generation source, introducing solar energy in a more vigorous manner and combining solar energy with hydroelectric energy,” Braga told reporters about the solar farm project.

“We are preparing ourselves to win the challenge in 2015 and be able to deliver a model and an electric system starting in 2016 which will be cheaper, more secure and with greater technological innovation,” Braga said. Electricity produced at the farm is expected to cost between $69 and $77 per megawatt hour, reports say.

Tuesday, April 14, 2015

March Newsletter - Big Data Analytics and Cyber-Security

Cyber security analytics is rapidly becoming a Big Data application for one simple reason: large organizations are collecting, processing and analyzing more and more data in order to effectively address the new cyber threat landscape.
Big Data Analytics and Cyber-Security is Vega’s topic of the month.
You can find our monthly technical review on page 3, and a few examples of Israeli solutions for this topic on page 5.



A CHANGING THREAT LANDSCAPE
The cyber threat landscape has changed dramatically over the last 5 years. The new industrialization and internationalization of digital criminality, combined with the limited legal responses available, have enabled the dramatic growth and convergence of both simple and sophisticated attacks.


It is now generally accepted, not only in the security profession but also in the Boardroom, that every organization will be attacked in one form or another on a regular basis. Some of those attacks will inevitably succeed. This is driving changes in the defensive stance of leading firms, where a greater emphasis is now being placed on identifying and limiting damage from successful attacks. Previously many firms had the stated goal of preventing all possible attacks.
As a direct result, effective and efficient security operations have become a key cyber defense capability within many leading organizations. Innovative and leading commercial organizations are now building increased security monitoring and security analytics capabilities to sit alongside effective threat intelligence and critical incident management capabilities. Their goal is to predict, to limit and to manage the inevitable attacks they will face.

INCREASED MONITORING LEADS TO BIG DATA
Cyber security analytics is rapidly becoming a Big Data application for one simple reason: large organizations are collecting, processing and analyzing more and more data in order to effectively address the new cyber threat landscape.
The promise of Security Information & Event Management (SIEM) technologies was to deliver advanced analytics capabilities. The reality is that SIEM products weren’t designed for Big Data analytics and generally cannot meet the rapidly evolving needs that leading commercial organizations now demand.
SIEM does provide a good foundation for security monitoring, offering a near real-time signature or rules-based detection capability to look for known threats. SIEM is also great for compliance and reporting. However, SIEM does not scale to detect unknown threats across all the available data. Data often has to be pre-filtered before being loaded into a SIEM, which effectively presupposes where the risk lies. SIEM cannot do the advanced security analytics that are required today.
It is likely the SIEM platforms and the current range of Big Data cyber security analytics platforms will move towards convergence over the next five years. However, this is new ground for both groups of vendors and for now separate products remain necessary to achieve the full potential benefits of each.

BEHAVIORAL ANALYTICS FOR DETECTION
Behavioral analytics understands past human behavior, predicts future behavior and identifies anomalous behavior. Behavioral analytics has been used extensively in fraud detection and prevention because different individuals naturally display different behaviors, and legitimate behavior is practically always different from that exhibited by a fraudster.

Behavioral analytics takes advantage of this fact. Rather than just looking for specific indicators, behavioral analytics combines knowledge with monitoring to determine whether behavior is expected and legitimate, or suspicious.
Behavioral analytics is a Big Data challenge not only because of the volumes of data involved, but also because of the need to bring a wide variety of data sources and formats together to create a full picture. Cyber security analytics is increasingly adopting behavioral analytics from the fraud detection field in order to address the reality that traditional security solutions have proven ineffective against the incredible variety and volume of digital criminality, such as cyber espionage, cyber crime, hacktivism and the insider threat. This emerging convergence of cyber and fraud threat detection means that the benefits realized from behavioral analytics in combating both will increasingly drive even greater operational efficiencies and investment decisions.
Behavioral analytics is proving to be more robust, enduring and effective than traditional signature and rules-based analytics. Figure 1 demonstrates the contrast between the two. In short, an organization using behavioral analytics will find anomalies that other point solutions and systems cannot.
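
To make the contrast concrete, here is a small Java sketch in the spirit of that comparison: a rule check that only matches a known-bad indicator, next to a behavioral check that flags activity far outside a user's own baseline. The blacklist, the download metric and the three-standard-deviation threshold are assumptions for illustration, not a description of any specific product.

    import java.util.List;

    public class LoginAnalytics {
        // Signature/rule-based: flag only a known-bad indicator.
        static boolean ruleBased(String sourceIp, List<String> blacklist) {
            return blacklist.contains(sourceIp);
        }

        // Behavioral: flag activity that deviates strongly from this user's own history.
        static boolean behavioral(double todaysDownloadsMb, List<Double> history) {
            double mean = history.stream().mapToDouble(Double::doubleValue).average().orElse(0);
            double variance = history.stream()
                    .mapToDouble(v -> (v - mean) * (v - mean)).average().orElse(0);
            double stdDev = Math.sqrt(variance);
            if (stdDev == 0) {
                return todaysDownloadsMb > mean;       // degenerate baseline
            }
            double zScore = (todaysDownloadsMb - mean) / stdDev;
            return zScore > 3.0;                       // assumed anomaly threshold
        }

        public static void main(String[] args) {
            List<Double> usualDailyMb = List.of(40.0, 55.0, 48.0, 60.0, 52.0);
            // A 900 MB day from an unlisted IP passes the rule check but trips the behavioral one.
            System.out.println(ruleBased("10.0.0.7", List.of("203.0.113.9")));   // false
            System.out.println(behavioral(900.0, usualDailyMb));                 // true
        }
    }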

A BIG DATA PLATFORM FOR INVESTIGATION
The operational cost of the technology and people required to effectively detect, triage and investigate security incidents is too high. Limits on data collection, non-interoperable tooling and subsequent data mining mean that, once suspicious indicators have been identified, it can take weeks for a cyber-investigator to collect and analyze the enterprise-wide data required to identify the appropriate response.
Big Data platforms can enable faster query times and a more seamless approach for security analysts retrieving and analyzing data across multiple sources and formats.
In most cases, a Big Data cyber security solution comprises three core components (a minimal interface sketch follows the list):

  • Platform - Massively scalable technology platform that correlates data acquired from across the IT infrastructure
  • Analytics - Behavior-based threat detection using unique attack models 
  • Investigator - Powerful threat intelligence management and investigation toolset providing visualizations, rich contextualization and correlation of threats, indicators, events and alerts.
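
As a rough illustration of how those three components might fit together, here is a minimal set of Java interfaces and a driver loop. The names and method shapes are assumptions made for this sketch, not a reference architecture from the newsletter.

    import java.util.List;

    interface Platform {                  // massively scalable store that correlates raw events
        void ingest(String source, String rawEvent);
        List<String> query(String correlationKey);
    }

    interface Analytics {                 // behavior-based detection running over the platform
        List<String> detectAnomalies(Platform platform);
    }

    interface Investigator {              // triage, context and visualization for whatever gets flagged
        void enrichAndVisualize(String alert, Platform platform);
    }

    class SecurityPipeline {
        static void run(Platform platform, Analytics analytics, Investigator investigator) {
            for (String alert : analytics.detectAnomalies(platform)) {
                investigator.enrichAndVisualize(alert, platform);   // hand each detection to the analyst tooling
            }
        }
    }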


SUMMARY
As a result of the rapidly changing and expanding cyber threat landscape, many organizations have increased their security monitoring capabilities, both in terms of the volume and the variety of data collected. As such, cyber security analysis is now becoming a Big Data problem for both the detection and investigation of incidents.

Monday, April 13, 2015

The Man Who Can Save The World From Wasting Water

Innovator Amir Peleg and his company TaKaDu have revolutionised the Israeli water network
By Amanda Little, GCC

Amir Peleg hunches his broad, 6-foot-3-inch frame into a tunnel leading to one of several reservoirs that supply water to Jerusalem. Condensation collects on the ceiling, inches overhead, like thousands of tiny stalactites. Peleg, an entrepreneur whose self-given job title is “chief plumbing officer”, catches a droplet on his palm. “Literally every drop counts,” he says. “This is the modern-day Gihon.”
Gihon was the ancient, intermittent spring that made human settlement possible in Jerusalem circa 700 BC. Today, fresh water sources in Israel and the surrounding region are more precious than they were in the Bronze Age. About 1 million residents continually draw water from this reservoir, which is filled by pipelines snaking from the Sea of Galilee 145 kilometres north. Located at the edge of Jerusalem, the reservoir is held in a massive underground vault patrolled by armed guards to keep insurgents from poisoning the supply. Thick cement walls surround a floodlit pool of water, ghostly and luminous, 40 feet deep and wider and longer than two football fields.

Until TaKaDu came along, the water-utility world was almost deaf and blind

Like most of its neighbours, Israel is a desert nation, and during the past seven years it’s struggled through a drought with record-low rainfall. In response, Peleg and others have come up with an array of innovations, from microscopic sewage scrubbers to supersize desalination plants to smart water networks. Israel now has higher agricultural yields than it’s had in non-drought years. It even has a water surplus, a portion of which (about 150 million cubic metres per year) it pumps to Jordan and the Palestinian Authority.
“I don’t think it’s overkill to say that Israeli entrepreneurs are disrupting and reinventing how the world creates and conserves water,” says Peleg, 48. He’s become one of the leaders of a water-tech movement that began in the 1950s, when Israel’s first prime minister, David Ben-Gurion, implored scientists and engineers to “make the desert bloom”.

Tuesday, April 7, 2015

Brazil Water, Water … Where?

Brazil's most populous region facing worst drought in 80 years
By Michael Royster

Brazil’s Northeast has long suffered from periodic droughts, particularly in the semi-arid region known as the “sertão”. That has prompted tens of thousands of people there to pack up and migrate south to São Paulo and Rio de Janeiro, where jobs and water were in abundance.
Or, at least there was water until the drought of 2014/15 which struck Brazil’s Southeast. Just as in the sertão, the rainy season (December, January, February) has been inordinately dry. Reservoirs have shrunk to alarming levels, which the papers headline every day.

“Volume morto” has become a well-known phrase; literally “dead volume”, it refers to the water at the bottom of a reservoir, below the level of the gates that release water for consumption. In fact, São Paulo has now claimed to have found third and fourth levels of “dead water”. That’s a bit like saying they’ve hit rock bottom and are still digging.

There are potential problems with “dead water” — for instance no one knows if it’s really safe, since it’s been sitting down at the bottom collecting God knows what for decades. Moreover, in order to use this water, it has to be pumped upwards – that requires using more energy which requires … more water.
According to the politicians running these states, the problem is not serious. There’s no need for rationing as there was back in 2001, they say, because, well, because… it’s going to rain a lot in February and March! Phew! What a relief! Everyone can now breathe easily and get on with the important business of wondering whose mask to wear during Carnival.

All is not bad news, however. During the last few days, the Governors of São Paulo and Rio de Janeiro have begun to show an interest in desalinization of sea water. Desalinization, although it’s very expensive, might make sense for Rio, because the largest population centers are at sea level. For São Paulo, however, any sea water would have to be pumped 800 meters uphill to reach the twenty million consumers in Greater São Paulo.

During Lula’s first term, the government began a megaproject to divert water from the São Francisco River into the sertão, so that Nordestinos would not have to migrate southwards any more. The principal source of the São Francisco is in Minas Gerais, which is also drought-stricken. Expect political battles, because landlocked Minas Gerais will want to keep most of the São Francisco water for itself.

Do not expect rationing — that would be rational. After heavy rains for a few days, Rio and São Paulo governors praise rainmaking Saint Peter and revert to being ostriches, plunging their heads once again into the sand. That’s easier to do now, there’s lots of sand in the newly dry riverbeds and reservoirs.

There’s a traditional Carnival song whose refrain is: “De dia falta água, de noite falta luz!” (“No water by day, no power by night!”) You’ll hear that a lot this Carnival.

Thursday, April 2, 2015

Wearable device slows brain tumor growth

TTFields therapy, developed by an Israeli scientist, delivers low-intensity alternating electric fields via a scalp device to inhibit cancer cells.
By Abigail Klein Leichman, Israel21c

A novel wearable device, already used on nearly 2,000 patients to slow the growth of cancerous glioblastoma brain tumors using electrical fields, is now being tested to judge its effectiveness against other types of solid tumors.
Novocure Chief Science Officer Eilon Kirson tells ISRAEL21c that the 15-year-old company’s Tumor Treating Fields (TTFields) technology is being tested on ovarian and pancreatic cancer patients and patients with cancers that have spread to the brain.
At the same time, Novocure is involved in trials to see if TTFields can extend the life of even more patients with glioblastoma, the most common form of primary brain cancer in adults. Approximately 10,000 new cases are diagnosed in the United States each year.
“Electric field-based therapy had never been used to treat cancer beyond very local therapies,” Kirson explains. “Treating entire parts of the body this way is a completely novel concept and technology, and there is no other one like it. Novocure owns the entire IP portfolio for the science and the product.”
TTFields therapy, developed by Dr. Yoram Palti, now a retired Technion professor, is delivered via a non-invasive electrode device attached to the patient’s scalp. The low-intensity alternating electric fields have been clinically proven to slow and even reverse tumor growth by inhibiting cell division and replication.
“If you think of standard cancer therapy – surgery, radiation and chemo – TTFields can be considered the fourth modality to be used independently or added to the other modalities to achieve better outcomes,” says Kirson. “As we move forward and do more clinical trials, we believe that it may be widely applicable in many types of cancer.”

Extending life of glioblastoma patients

Most of Novocure’s published data relates to TTFields’ effectiveness and safety in treating glioblastoma patients whose tumors have recurred after initial treatment.
Results of multicenter trials begun in 2006 proved the device just as effective as chemo in extending patients’ overall survival, but with lower toxicity and better quality of life.

In 2011, US Food and Drug Administration (FDA) approval enabled Novocure to launch its Optune delivery system for TTFields in patients with recurrent glioblastoma after chemotherapy.
More than 150 neuro-oncology centers in the United States are now trained in the technology, which is available by prescription.

The company is now working with the FDA to obtain a new approval to treat newly diagnosed glioblastoma patients, together with standard chemo, based on the results of a phase III trial that began in 2009. In November last year, monitors recommended ending the trial early because it had already demonstrated TTFields’ efficacy. The monitors urged that patients in the control group be allowed immediate access to the treatment, and the FDA approved this crossover.

Last year, Novocure launched Optune in Europe — mainly in Germany, Switzerland and Austria — for treating recurrent and newly diagnosed glioblastoma. In addition, a regulatory application was filed in Japan.

Effective against other cancers

“Since we believe TTFields are applicable to a wide variety of solid tumors in the body, we are beginning to test them in other indications,” says Kirson.
Patient enrollment has just been completed in a phase II clinical trial of TTFields together with one chemotherapy drug in 20 patients with newly diagnosed advanced pancreatic cancer. Another 20 will be enrolled in a cohort to be treated with TTFields and two chemo drugs for pancreatic cancer, the fourth-leading cause of cancer-related death in the United States and Europe.
“We are proud to take part in a clinical trial testing a novel approach for the treatment of pancreatic cancer,” said Dr. Fernando Rivera, senior medical oncologist at the Santander University Hospital in Spain, who has treated five patients so far.
“This regional, non-invasive treatment that acts on dividing cancer cells has the potential to make a real change in the treatment paradigm for many of the 340,000 patients diagnosed with pancreatic cancer worldwide every year, mostly at an advanced, non-curable stage,” said Rivera.

Novocure plans a phase III clinical trial this year to test the ability of its technology to prevent a recurrence in patients who underwent stereotactic radio-surgery after non-small-cell lung cancer spread to the brain.
In a recently published pilot study, 42 non-small-cell lung cancer patients in Switzerland were treated with TTFields from 2008 to 2010 in combination with standard chemotherapy.
“We are also doing smaller-scale studies in mesothelioma, an aggressive cancer in the chest,” Kirson says, “as well as a trial in recurrent ovarian cancer in several European centers, where we will treat patients together with the standard chemo regimen.”



The multinational company has more than 200 employees in six countries, about 40 of whom work in Novocure’s Haifa research-and-development center.