Pegasus Surveillance Software || Vulnerable

THE PEGASUS PROJECT

For months, reporters from numerous countries investigated where, how and against whom the Pegasus espionage software made by the Israeli company NSO Group is used by intelligence services and police authorities around the world. The starting point was a list of more than 50,000 cell phone numbers from around 50 countries that was leaked to the non-profit organization Forbidden Stories and to Amnesty International.

THE PARTNERS

Coordinated by Forbidden Stories, the following newsrooms were involved in the research: Le Monde and Radio France from France, The Washington Post from the USA, The Guardian from Great Britain, DIE ZEIT, Süddeutsche Zeitung, NDR and WDR from Germany, Direkt36 from Hungary, Knack and Le Soir from Belgium, Haaretz from Israel, The Wire from India, Daraj Media from Lebanon, and Proceso and Aristegui Noticias from Mexico. The investigative research platform OCCRP was also involved.

Amnesty International's Security Lab was responsible for analyzing the data. A detailed report on the methodology of the investigation and on how the lab was able to detect traces of Pegasus on the cell phones can be found here. The results were confirmed in an independent review by the Canadian IT security laboratory Citizen Lab at the University of Toronto. Its security researchers have been investigating cyber attacks on dissidents and journalists, including those involving Pegasus, for years. In a second report, they confirmed that certain traces found on the cell phones examined have so far only been observed on devices believed to have been infected with Pegasus.

THE DATA

According to Forbidden Stories, the cell phone numbers were entered into a system operated by the NSO Group, a company that sells its Pegasus spyware to police authorities and intelligence services around the world. The entries on the list range from 2016 to the present. The numbers were entered by more than ten states that are NSO customers.

Before a planned surveillance operation with Pegasus, those doing the monitoring usually first check whether a mobile phone is online and in which country it is located. According to NSO, Pegasus cannot be used in every country.

Read more: Cyber Weapons

Where a cell phone is registered with a cellular network operator and whether it is online is information that network operators around the world store in a system called the Home Location Register (HLR). For cell phone communication to work internationally, operators must, among other things, exchange data from this HLR with one another. Anyone who has access to this system can call up information on any cell phone worldwide. In addition to the phone number, this includes the unique identification number of the SIM card.
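
To make the preceding paragraph concrete, here is a minimal, hypothetical sketch in Python of the kind of record an HLR lookup yields. The field names and values are invented for illustration and do not correspond to any real operator interface or commercial lookup service.

```python
# Hypothetical shape of an HLR lookup result -- illustrative only.
from dataclasses import dataclass

@dataclass
class HlrLookupResult:
    msisdn: str        # the phone number that was queried
    imsi: str          # unique identification number of the SIM card
    reachable: bool    # whether the phone is currently online
    country_code: str  # country of the network the phone is registered in
    network: str       # name of the serving mobile network operator

# What such a query result might look like (placeholder values):
result = HlrLookupResult(
    msisdn="+49170XXXXXXX",
    imsi="26201XXXXXXXXXX",
    reachable=True,
    country_code="DE",
    network="Example Mobile",
)
print(f"{result.msisdn} is {'online' if result.reachable else 'offline'} in {result.country_code}")
```

It is exactly this kind of metadata, not the content of calls or messages, that an HLR query exposes.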

The list of more than 50,000 numbers therefore contains queries made via the HLR system. That does not mean that each of the 50,000 numbers was subsequently attacked or infiltrated with Pegasus. NSO emphasizes that it "has no access to the data of the target persons of its customers". But the entry of a number means, at the very least, that police and intelligence services had an interest in the person to whom that cell phone number belongs.

THE PEGASUS TRACES

The Pegasus Project succeeded in finding digital traces of the attacks on the smartphones of various victims, even where the Trojan had deleted itself. This was possible because the reporters involved in the investigation contacted potential victims and asked them to hand over their cell phone data for examination by Amnesty's Security Lab. In total, the Security Lab analyzed 67 smartphones belonging to people who appeared on the original list.

Read also: The Attack After The Murder

According to Amnesty's cyber experts, traces of Pegasus activity were found on a total of 37 devices: 23 cell phones were successfully infected with the malware and 14 showed evidence of an attempted attack with the cyber weapon. On some of the devices examined, the program was still active as recently as July. Even the latest iPhones with the latest operating systems were affected. In 15 cases it could be proven that the devices had been infected with the Pegasus malware less than a minute after the query documented in the leaked data.

Every smartphone is vulnerable. Just how comprehensively was shown by the recent research of a journalist consortium made up of 17 newsrooms. With the Pegasus spyware from the Israeli NSO Group, attackers can tap emails, social media posts, passwords, contacts, images, videos and browsing history. They can also use it to activate the camera and microphone, bypass the encryption of chat messages or listen in on calls. With the help of this software, attackers can follow people in real time: where they are, whom they are talking to, what is on their mind. Apparently, this works regardless of whether the person is using an Android smartphone or an iPhone.

More than 50,000 phone numbers were potential targets of such spying, among them those of journalists, human rights activists, lawyers, diplomats and politicians. In theory, almost any other smartphone user could be affected as well, because the security gaps in an operating system that the NSO Group exploits for its Pegasus software can also be exploited by other attackers.

Read more: French Head Of State "Emmanuel Macron" Was Also The Target Of Pegasus Espionage

It has long been known that the NSO Group's mobile hacking tools enable spying attacks on smartphones. But its developers keep finding new ways to crack them. Are the manufacturers of smartphone operating systems and software, above all Apple and Google, doing enough to ensure the security of their users?

Not a standard attack

"Apple and Google are already setting the bar very high when it comes to IT security," says Michael Backes, founding director of the Helmholtz Center for Information Security (Cispa). For attackers, this means that even professional hackers cannot easily infiltrate operating systems or find vulnerabilities. Of course, no operating system is one hundred percent secure, says Backes.

One has to understand: attacks using the Pegasus software are not standard hacks but involve enormous effort. They are very sophisticated, cost millions of dollars to develop, often have only a short lifespan and are used in a targeted way against specific people, says Apple security chief Ivan Krstić. "While that means they are not a threat to the overwhelming majority of our users, we continue to work tirelessly to protect all of our customers and we are constantly adding new safeguards to their devices and data."

This is necessary because attackers are constantly working out new methods and angles of attack - regardless of how many protective measures the manufacturers take or how many vulnerabilities they close. This applies to Pegasus, but not only to Pegasus. Security researcher Backes makes the same point: "The problem with a program like Pegasus is that it does not always exploit the same vulnerability, but continuously relies on newly found vulnerabilities."

The Pegasus software also makes use of so-called zero-day exploits, i.e. the exploitation of security gaps that the manufacturers are not yet aware of. These as yet unknown vulnerabilities are very valuable to attackers precisely because the manufacturers have not yet been able to develop a patch to close them. As long as the vulnerabilities remain undiscovered, smartphones are exposed to them - not only to spyware like Pegasus, but to any other actor who knows about the vulnerability.

For this reason, attackers and manufacturers alike are prepared to pay a lot of money for knowledge of such previously unknown security gaps. Companies such as Google, Apple and many others have launched bug bounty programs, which means they pay sometimes substantial rewards to hackers and security researchers who report such security gaps to the manufacturers. This is meant to help identify weak points in the code and repair the software as quickly as possible. Above all, however, it is intended to prevent information about previously unknown vulnerabilities from being sold to other actors who want to exploit it for their own purposes.

Read also: Princess Latifa and Sheikh Mohammed Bin Rashid Al Maktum, ruler of Dubai

"It will hardly be possible to avoid zero-day exploits in the short term," says Backes. The manufacturers could currently only try to find security gaps as comprehensively as possible. To this end, manufacturers could, for example, consistently further develop analysis tools or expand their bug bounty programs.

Apple's bug bounty program in particular does not have a good reputation among security researchers; the company is considered stingy. "If you talk to the majority of IT security researchers, none of them have anything positive to say about Apple's bug bounty program," says Patrick Wardle, an American IT security researcher who specializes in Apple. Communication is poor, he says, and the sums paid are "very, very low".

At least when compared with what others pay for zero-days and for software developed on their basis. The NSO Group, for example, sold its software to countries such as Saudi Arabia. And depending on how strong the spying interest of state actors is, very large sums can be paid for zero-day exploits themselves or for spying tools built on top of them. Apple may not be able to compete with that. "What we'd like to see, however, are bug bounty programs that are at least close to what an expert could get," says Wardle.

Apple should also make it easier for security researchers to report vulnerabilities, says the expert. A former Apple employee told the Washington Post that it was difficult to communicate with security researchers who reported bugs in the software because the company's marketing department got in the way. “Marketing could always veto,” said the person. There were a number of ready-made answers that were used over and over again. That was annoying and slowed everything down.

Read more: Indian Prime Minister Narendra Modi And The Pegasus Spyware

Apple itself rejects any criticism that it lacks commitment to the security of its operating system. When asked, the company wrote that its bug bounty program is a great success: it pays millions of dollars to researchers, offers the highest payouts in the industry, and it is untrue that Apple does not pay well. A side effect of this success is that processing times are sometimes delayed, but the company says it is working to eliminate these bottlenecks. The marketing team only has a say in some interactions between Apple employees and external security researchers, and only to keep messaging around new products consistent.

Risk factor iMessage

The fact that Apple is currently being criticized more than Google is also because the known cases involved Pegasus spyware being planted on iPhones. Amnesty International's Security Lab examined 67 smartphones whose numbers had been leaked to the research collective Forbidden Stories. On 37 of these smartphones there was forensic evidence of an infection with the Pegasus software or of an attempted infection. 34 of them were iPhones and three were Android smartphones.

In Apple's case, the potential vulnerabilities through which Pegasus was smuggled onto iPhones can also be traced fairly well. The preinstalled chat service iMessage is said to have played a role in half of the successful and attempted attacks.

According to IT security researchers, one reason for the vulnerability of iMessage is that anyone can send anyone else a message unsolicited. "If you know a person's phone number or Apple ID, you can simply send an exploit and Apple will kindly bring it to the target person," says Apple expert Wardle. By this he means that in some cases the person attacked no longer even has to open a link or an attachment - a zero-click message, as such imperceptible contact attempts are called in specialist circles, is enough.

Read also: Former Mossad Chief's Eye-Opening Interview

It could also be done differently, as the messaging app Signal shows: there, strangers can only send a message request, which the recipient can reject. That would be a simple way to make iMessage more secure, says Wardle: strangers should not simply be able to send you messages. Even today, iPhone users can filter messages from unknown senders, but they have to turn this function on explicitly.
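
The design difference is easy to sketch: a messenger that gates messages from unknown senders never processes their content until the recipient opts in. The following Python sketch illustrates only that gating idea; it is not Signal's or Apple's actual implementation, and the class and method names are invented for illustration.

```python
# Schematic sketch of a "message request" gate: messages from unknown
# senders are held back and not processed until the recipient accepts them.
class Inbox:
    def __init__(self):
        self.contacts = set()   # senders the user has accepted
        self.pending = {}       # held messages, keyed by unknown sender
        self.delivered = []     # messages actually delivered to the user

    def receive(self, sender: str, payload: str) -> None:
        if sender in self.contacts:
            self.delivered.append((sender, payload))              # known: deliver
        else:
            self.pending.setdefault(sender, []).append(payload)   # unknown: hold

    def accept(self, sender: str) -> None:
        self.contacts.add(sender)
        for payload in self.pending.pop(sender, []):
            self.delivered.append((sender, payload))

    def reject(self, sender: str) -> None:
        self.pending.pop(sender, None)  # drop everything the stranger sent

inbox = Inbox()
inbox.receive("+1555XXXXXXX", "unsolicited message")  # unknown sender: held, not shown
inbox.reject("+1555XXXXXXX")                          # the user never sees it
```

The point of such a gate is to shrink the attack surface: content from strangers is not rendered or acted upon before the recipient has made a decision.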

Despite such criticism, Apple is not very contrite. "For more than a decade, Apple has been leading the industry in terms of security innovation and as a result, IT security researchers agree that the iPhone is the most secure consumer device on the market," said Apple's security chief Krstić about the Pegasus research.

Security researcher Wardle credits Apple with having built, through its marketing, a reputation for the iPhone being difficult to hack. "But in reality, that's not really the case." The problem, he says, is that this marketing leads Apple users to place excessive trust in their devices. That is risky, because they may then be less careful and more likely to click on links or download programs, assuming that their device is already somehow secured.

Different operating system, different problems

In fact, it would be just as negligent for Android users to assume that their smartphones are better protected against the Pegasus spyware than iPhones. Amnesty's security researchers not only examined fewer Android devices; the logs they obtained from Android smartphones also did not provide enough information about a possible infection, the researchers said. But the lack of evidence of the malware does not mean that a device was not infected.

Android is simply vulnerable in a different way than iOS. Apple's software world is considered a closed system. This means that Apple not only publishes little information about its software, but also controls more closely who is allowed to move within the system - for example, which apps are allowed into the App Store. The advantage of this is that Apple can better guarantee data protection and security: every app, for instance, is checked before it is added to the App Store.

Security researchers also criticize this system. "The security and privacy of the device can be used by sophisticated attackers to their advantage," says Wardle. An example of this is iMessage: The service is encrypted, which is good from a data protection perspective. However, this also means that an exploit cannot be discovered during transmission. "Nobody can examine this data," said Wardle.

Cispa boss Backes says the closed system also has the disadvantage that far fewer weak points can be found in advance, since third parties have little insight. This is where iOS differs from Android: Android's source code is largely open to the public, which also makes it possible for external security researchers to search the code for vulnerabilities. Android's open system has disadvantages too, since criminals can also find vulnerabilities more easily, says Backes. Nevertheless, he considers Android's approach of relying on greater transparency "clearly better".

Android is no impregnable fortress, however. Precisely because it is such a popular operating system, it is also targeted by less sophisticated attackers than the Pegasus buyers. According to the IT security company Kaspersky, malware on Android devices made up the largest share of all mobile threats in 2018. In 2020, the exploitation of security vulnerabilities increased seventeen-fold. The Google Play Store is also not considered as selective about apps as Apple's counterpart. In addition, smartphone manufacturers can customize Android to their needs. In general, Cispa boss Backes says: "The more complex a system becomes, the more dependencies there are, the more third-party providers are involved, the more difficult it is to secure it."

Like Apple, Google, when asked, also points to its activities against attacks such as those carried out with the Pegasus software. The company writes that it tracks a number of threatening attackers, including the NSO Group, which sells the Pegasus spyware. Every month, it sends more than 4,000 warnings to users informing them of attempts to infiltrate their accounts - whether by state-sponsored attackers or by other actors such as the NSO Group. In December, Google supported Facebook in its legal battle with NSO and highlighted the potential damage caused by NSO's practices. Together with companies such as Microsoft and Cisco, Google backed a statement at the time that described the NSO Group's business model as "cyber surveillance as a service". Apple did not join the companies.

After the Pegasus research was published, Apple and Google released or promised security updates. Users should download them quickly to protect their smartphones.

It is unlikely that attacks like those carried out with Pegasus can be fundamentally prevented. Even if the NSO Group could be stopped, another company would soon fill the void. For around previously unknown security gaps - despite all the companies' efforts - a trade has long since developed that is illegal but also very profitable.

Companies certainly cannot be held solely responsible for the fact that highly sophisticated groups attack their operating systems and smartphone software. What they could very well intensify, however, are their fundamental efforts to ensure IT security. Companies like Apple and Google could increase both the cost of vulnerabilities and the risk of exploiting them - not everywhere, but at least on certain channels like iMessage, writes the renowned cryptographer Matthew Green of Johns Hopkins University in a blog post.

One possibility would be for the companies to raise their bug bounty payouts significantly once more. To put this into perspective with a few figures: Apple recently had almost 200 billion US dollars in cash reserves, Google a good 130 billion. According to the Zerodium platform, a security hole that can be used to attack people without any interaction on their part currently fetches around two million US dollars for iOS and 2.5 million US dollars for Android. The companies could easily double or triple such payments and would still have plenty of money in the bank. That would also raise the financial hurdles for groups like the NSO Group.
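
A rough back-of-the-envelope calculation, using only the round figures quoted above, shows how small such an increase would be relative to the companies' reserves. The numbers are the article's; the comparison itself is merely illustrative.

```python
# Back-of-the-envelope comparison using the figures quoted in the article:
# approximate cash reserves and Zerodium's zero-click exploit prices, in US dollars.
figures = {
    "Apple":  {"reserves": 200e9, "zero_click_price": 2.0e6},  # iOS
    "Google": {"reserves": 130e9, "zero_click_price": 2.5e6},  # Android
}

for company, f in figures.items():
    tripled = 3 * f["zero_click_price"]        # cost of tripling one such payout
    share = tripled / f["reserves"] * 100      # as a share of cash reserves
    print(f"{company}: tripling a zero-click bounty to ${tripled / 1e6:.1f} million "
          f"would amount to roughly {share:.4f}% of its cash reserves")
```

Even tripling the top payout stays in the range of a few thousandths of a percent of either company's reserves, which is the article's point about raising the financial hurdles.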

Another important point is that companies plug security gaps that have become known as quickly as possible. And instead of always emphasizing how secure their own devices are, it would sometimes make more sense to point out potential weak points to users - so that they remain careful.

Just because companies like Apple and Google are already doing a lot does not mean they have exhausted all their options with today's versions of their products, writes security researcher Green. "There is definitely more that companies like Apple and Google could do to protect their users." But something will only change if that is also demanded of the companies.

"That should be a wake-up call for security on the Internet," WhatsApp boss Will Cathcart told the Guardian after the Pegasus research was published. As early as 2019, there were attempts to spy on 1,400 WhatsApp users with Pegasus . Government officials, journalists and human rights activists were also affected at the time. "It is not enough to say that most users shouldn't have to worry. It is not enough to say, 'Oh, these are only thousands or tens of thousands of victims.'" When an attack affects journalists or human rights defenders around the world, then it affects us all, so Cathcart. "If anyone's smartphone is not secure, nobody's smartphone is safe."
