
CyberSecurity Institute

Security News Curated from across the world


Industrialisation Of Hacking Will Dominate The Next Decade

Posted on December 8, 2009 (updated December 30, 2021) by admini

· A move from application to data security as cyber-criminals look for new ways to bypass existing security measures and focus on obtaining valuable information.
· Increasing attacks through social network sites where vulnerable and less technically savvy populations are susceptible to phishing attacks and malware infection.
· An increase in credential theft/grabbing attacks. As the face value of individual credit card records and personal identity records decreases (due to massive data breaches), attackers look for more profitable targets. Obtaining application credentials presents an upsell opportunity, as they provide greater immediate value to stolen-data consumers up the food chain.

· A move from reactive to proactive security as organisations move from sitting back and waiting to be breached, to actively seeking holes and plugging them as well as trying to anticipate attacks before they come to realisation.

Application owners need to get their act together and tackle these trends head on. Organisations serious about protecting data will need to address not only the application level but also the source of the data. This will mean introducing new technologies, including database firewalls, file activity monitoring, and the next generation of DLP products. These tools should be combined with other technologies, such as web application firewalls and classic DLP solutions, to allow organisations to keep track of data flow across the enterprise from source to sink.

He sees the automation of hacking as a major issue, and technical measures will be needed to combat this trend.

Organisations must look to integrate their protection tools with proactive security measures. Admittedly, these are not readily available today, but the security community is currently developing solutions, and they will become widely available over the next few years.

The next decade must see the IT security industry rise up and stand shoulder to shoulder if it is to win the fight against cyber-criminals.

This industrialisation has produced distinct criminal roles:

· Botnet growers/cultivators, whose sole concern is maintaining and increasing botnet communities
· Attackers who purchase botnets for attacks aimed at extracting sensitive information (or other more specialised tasks)
· Cyber-criminals who acquire sensitive information for the sole purpose of committing fraudulent transactions

As with any industrialisation process, automation is the key factor for success.

Indeed we see more and more automated tools being used at all stages of the hacking process.

Proactive search for potential victims relies today on search engine bots rather than random scanning of the network.

Massive attack campaigns rely on zombies sending a predefined set of attack vectors to a list of designated victims.

Attack coordination is done through servers that host a list of commands and targets.

SQL injection, “Remote File Include” and other application-level attacks, once considered cutting-edge techniques applied manually by savvy hackers, are now bundled into software tools available for download and use by the new breed of industrial hackers.
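The commoditisation is easy to illustrate with SQL injection itself. Below is a minimal sketch, using Python's built-in sqlite3 module and a hypothetical one-column table, of how a string-concatenated query falls to the classic tautology payload while a parameterised query does not:

```python
import sqlite3

# Hypothetical single-table database for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# The classic tautology payload, as bundled into point-and-click SQLi tools.
payload = "x' OR '1'='1"

# VULNERABLE: attacker input is concatenated straight into the SQL text,
# so the WHERE clause becomes  name = 'x' OR '1'='1'  -- always true.
vuln_rows = conn.execute(
    "SELECT name FROM users WHERE name = '" + payload + "'"
).fetchall()
print(vuln_rows)  # every row comes back despite the bogus name

# SAFE: a parameterised query sends the whole payload as a literal value.
safe_rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (payload,)
).fetchall()
print(safe_rows)  # no user is literally named "x' OR '1'='1", so no match
```

The same distinction holds for any database driver: placeholders keep attacker input out of the query's syntax, which is why parameterisation remains the baseline defence no matter how automated the attack tooling becomes.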

Search engines (like Google) are becoming an increasingly vital piece in every attack campaign starting from the search for potential victims, the promotion of infected pages and even as a vehicle for launching the attack vectors themselves.

In the last few days, Imperva tracked and analysed a compromise that affected hundreds of servers, injecting malicious code into their web pages. The injected pages were cross-referenced with keywords that scored highly in Google searches, generating traffic and thus creating drive-by attacks.

The scale of this attack, and others like it, is enormous and would not be achievable without total automation at all stages of the process.

Organisations must realize that this growing trend leaves no web application out of reach for hackers.

Attack campaigns are constantly launched not only against high-profile applications but against any available target.

Protecting web applications using application level security solutions will become a must for larger and smaller organisations alike.

End users who want to protect their own personal data and avoid becoming part of a botnet must learn to rely on automatic OS updates and anti-malware software.

Previously attracting mainly student communities, social networking sites such as Facebook, Twitter and LinkedIn are fast infiltrating mainstream populations, with practically every man, and his dog, now ‘on Facebook’.

Elderly people, as well as younger children, who did not grow up with an inherent distrust of web content, may find it very difficult to distinguish between messages of a true social nature and widespread attack campaigns.

Attackers will also take advantage of the social networking information made accessible by social platforms to create more credible campaigns (e.g. making sure your phishing email appears to come from your grandchildren).

The capabilities offered by social platforms, and their growing reach into other applications (webmail, online games), allow attackers to launch huge campaigns of a viral nature while at the same time pinpointing specific individuals.

Much like searching through the Google search engine for potential target applications, attackers will scan social networks (using automated tools) for susceptible individuals, further increasing the effectiveness of their attack campaigns.

An entire set of tools that would allow us to evaluate and express personal trust in this virtual society has yet to be developed and put to use by platform owners and consumers.

Even when considering manually executed fraud, it is evident that having multiple sets of valid credentials for an online trading application makes fraud much easier than merely having the personal data of account owners.

Consumers should protect themselves mainly from Trojan and KeyLogger threats by using the latest anti-malware software.

To date, the security concept has been largely reactive: waiting for a vulnerability to be disclosed, creating a signature (or some other security rule), then cross-referencing requests against these attack methods, regardless of their context in time or source.

http://www.businesscomputingworld.co.uk/?p=2017


Choosing SIEM: Security Info and Event Management Dos and Don’ts

Posted on December 2, 2009 (updated December 30, 2021) by admini

1. Security event management (SEM): Analyzes log and event data in real time to provide threat monitoring, event correlation and incident response. Data can be collected from security and network devices, systems and applications.

2. Security information management (SIM): Collects, analyzes and reports on log data (primarily from host systems and applications, but also from network and security devices) to support regulatory compliance initiatives, internal threat management and security policy compliance management.
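Both functions begin with the same mechanical step: collecting heterogeneous logs and normalising them into a common event schema. A minimal sketch of that step, with two invented log formats and field names (not any vendor's actual format):

```python
import re

# Two hypothetical raw formats: a firewall deny line and an auth-failure line.
FIREWALL = re.compile(r"(?P<ts>\S+ \S+) DENY src=(?P<src>\S+) dst=(?P<dst>\S+)")
AUTH_FAIL = re.compile(r"(?P<ts>\S+ \S+) login failure for (?P<user>\S+) from (?P<src>\S+)")

def normalise(line):
    """Map a raw log line onto a common event schema; None if unrecognised."""
    m = FIREWALL.match(line)
    if m:
        return {"time": m["ts"], "type": "fw_deny", "src": m["src"], "dst": m["dst"]}
    m = AUTH_FAIL.match(line)
    if m:
        return {"time": m["ts"], "type": "auth_fail", "src": m["src"], "user": m["user"]}
    return None

raw = [
    "2009-12-02 10:01:00 DENY src=10.0.0.5 dst=192.168.1.20",
    "2009-12-02 10:02:30 login failure for admin from 10.0.0.5",
]
for event in filter(None, map(normalise, raw)):
    print(event["type"], event["src"])
```

Once events from firewalls, hosts and applications share one schema, the same store can serve SEM-style real-time correlation and SIM-style compliance reporting.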

Traditional SEM vendors have responded by orienting products previously geared toward real-time event alerting and management toward log management functionality. For instance, ArcSight added its Logger appliance and additional deployment options to address compliance. Meanwhile, SIM players such as SenSage and LogLogic are adding real-time capabilities.

Jon Oltsik, an analyst at Enterprise Strategy Group, sees the market differently. The main driver, he says, is the need to keep up with security complexity. “There is an acute awareness that security attacks are more sophisticated and that security at a system level is harder than at the device level,” he says. Compliance is the second most important factor, he says, and the third is the need to replace early SIEM platforms that don’t scale or provide the right level of analytics and reporting capabilities.

Forrester expects consolidation among the 20-plus SIEM vendors in the next 12 to 36 months, as well as more cloud-based SIEM services.

Core Capabilities

According to Gartner, five critical capabilities differentiate SIEM products, whether you use them for SEM, SIM or both:

· Functions that support the cost-effective collection, indexing, storage and analysis of a large amount of information, including log and event data, as well as the ability to search and report on it. Reporting capabilities should include predefined reports, ad hoc reports and the use of third-party reporting tools.
· User and resource access reporting.
· Real-time data collection, a security event console, real-time event correlation and analysis, and incident management support.
· Scalability and deployment flexibility, since large volumes of event data will be collected and a wide scope of analysis reporting will be deployed.
· Access monitoring, which defines access policies and discovers and reports on exceptions. It enables organizations to move from activity monitoring to exception analysis, and is important for compliance reporting, fraud detection and breach discovery.

The need for compliance has encouraged smaller security staffs to adopt SIEM, and these buyers need predefined functions and ease of deployment and support over advanced functionality and extensive customization.

SIEM DOs and DON’Ts

DO include multiple stakeholders. When developing requirements, be sure to collect them from the full range of groups that may benefit from collected log data, including internal auditors, compliance, IT security and IT operations.
There are certainly customers just looking for log management because of a compliance requirement, and they may not have the internal resources to do anything but collect and document logs, Kavanaugh says. “But many buyers realize the capabilities inherent in log management software—the ability to collect, search and run reports—are valuable to security operations.” Once the security group gets involved, he says, they look at including network security devices, routers and other areas of the network environment where they don’t have great insight, as well as the real-time component.

When selecting a SIEM product at Liz Claiborne, Mike Mahoney, manager of IT security and compliance, involved architecture leaders from eight groups, asking them to respond to an in-depth questionnaire regarding what would help them improve their jobs. It ultimately took six months to complete the evaluation. “I wanted this to be a tool they would benefit from beyond log collection,” Mahoney says.
“Ultimately, the point of intersection is log management, but analytics might be done by two different platforms,” Oltsik says. “Whether you need security or compliance, you’re using the same log data.”

Correlation is a key aspect of SIEM systems, says Larry Whiteside, associate director of information security at the Visiting Nurse Service of New York (VNSNY). SIEM systems normalize logs from various systems, which helps you see the most important data you need out of those logs in a readable format. They also help you correlate events that the human eye could never perceive but that correlation rules can detect. “If you use correlation rules, you can run a report, and two events that are 10 minutes apart will be right on top of each other because they’re directly related to each other,” Whiteside says. He can also look at specific databases on specific servers and see who’s touching them. Or he can get log events to see what applications are talking to other applications and what database tables they’re hitting. For instance, if Server A is talking to Server B, and activity peaks on Sunday night at 10 p.m., he can drill in further to see what desktops are involved.
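Whiteside's ten-minute example can be sketched as a toy correlation rule: pair events that share a source address and fall within a time window. The event types, addresses and timestamps below are invented for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical normalised events, as a SIEM would hold them after collection.
events = [
    {"time": "2009-12-02 22:00:00", "type": "auth_fail", "src": "10.0.0.5"},
    {"time": "2009-12-02 22:04:00", "type": "port_scan", "src": "172.16.0.9"},
    {"time": "2009-12-02 22:09:00", "type": "db_read",   "src": "10.0.0.5"},
]

def parse(event):
    return datetime.strptime(event["time"], "%Y-%m-%d %H:%M:%S")

def correlate(events, window=timedelta(minutes=10)):
    """Pair events that share a source address and fall within the window."""
    hits = []
    for i, a in enumerate(events):
        for b in events[i + 1:]:
            if a["src"] == b["src"] and abs(parse(b) - parse(a)) <= window:
                hits.append((a["type"], b["type"]))
    return hits

# The failed login and the database read share a source nine minutes apart,
# so the rule surfaces them together; the port scan from another host does not.
print(correlate(events))
```

A production SIEM evaluates thousands of such rules incrementally over streaming events rather than pairwise over a list, but the underlying idea, a join on shared fields constrained by a time window, is the same.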

While software is the traditional form factor, Kavanaugh says, vendors have increasingly come out with all-in-one appliances, which do the data collection, analysis and correlation and use their own built-in databases to store copies of logs.

There are also many blended offerings, in which a server performs the real-time analysis, correlation and monitoring, and an appliance covers log collection.
Cincera warns that hardware and software account for half or less of the total cost of ownership of a SIEM implementation. The rest, he says, is the labor involved in creating, building and deploying the technology. “You can’t just put someone on the console and have them whip up 10 good correlation rules a day,” he says. “They need to understand things like, ‘These events need to be treated in this manner, or with this level of discretion.’” This requires a governance function to specify which events to care about and what actions to take. “There’s a cost to the organization based on that function,” Cincera says.

Another cost is maintenance, which includes keeping rules up to date, group management, permissions, alerting, monitoring and metrics. “You need to manage interfaces to upstream systems, things that feed information to the engine,” Cincera says. “You need to stay constantly involved, making sure connections stay in sync with one another, and that can be a daunting effort.” The work level grows dramatically based on the number of upstream systems you need to feed, he warns. “Every event you choose not to ignore is one on which you must act, even if it’s just to say, ‘noted,’ ” Cincera says. At some point, Cincera says, the rules, alerts and actions you take lose value and should be decommissioned.

Total cost of ownership is something no vendor is good at communicating, he adds. “They don’t want you to think of all those costs.”

http://www.csoonline.com/article/509553/Choosing_SIEM_Security_Info_and_Event_Management_Dos_and_Don_ts


Six Steps Toward Better Database Security Compliance

Posted on October 10, 2009 (updated December 30, 2021) by admini

1. Database Discovery And Risk Assessment Before organizations can start their database compliance efforts, they must first find the databases — and where the regulated data resides in them.
“That’s a big challenge for a lot of folks. They know where their mainframes are, and they know where a lot of their systems are but…they don’t really know which database systems they have on their network,” says Josh Shaul, vice president of product management for Application Security, a database security company.
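As a rough illustration of the discovery step, a first pass often amounts to checking which hosts answer on well-known database ports. A minimal sketch, assuming plain TCP reachability (a real discovery product would also fingerprint the service rather than trust the port number, and the hosts in the commented example are placeholders):

```python
import socket

# Well-known default ports for common database engines.
DB_PORTS = {1433: "MSSQL", 1521: "Oracle", 3306: "MySQL", 5432: "PostgreSQL"}

def find_databases(host, timeout=0.5):
    """Return the database services a host appears to run, judged by open ports."""
    found = []
    for port, name in DB_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 when the TCP connection succeeds.
            if s.connect_ex((host, port)) == 0:
                found.append(name)
    return found

# Example sweep over a (hypothetical) address range:
# for host in ("192.168.1.10", "192.168.1.11"):
#     print(host, find_databases(host))
```

The hard part, as Shaul notes, is not the scan itself but knowing which of the discovered databases actually hold regulated data, which is why discovery feeds directly into the risk-assessment step.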

2. Vulnerability And Configuration Management Once an inventory has been developed, organizations need to look at the databases themselves.
“Basic configuration and vulnerability assessment of databases is a key starting point for enterprises,” Shaul says.

3. Access Management and Segregation of Duties Figuring out who has access to regulated data, what kind of access they are given, and whether that access is appropriate for their jobs is at the heart of complying with regulatory mandates. “Sometimes it’s as simple as account management, password controls, and removing default accounts,” Laliberte says. Organizations need to be vigilant to constantly review roles and entitlements to prevent toxic combinations of privileges. Take, for example, a payments clerk who gets a promotion to run the accounts payable department. In the new position, that person “owns” the AP system and has the ability to modify and delete checks that have been written.

4. Monitoring Risky Behaviors And Users Unfortunately, there is a built-in segregation of duties violation in every database — and it’s one you can’t get rid of, Shaul says.
“Databases in general don’t give you the ability to take DBAs’ data access away from them,” Shaul says. “And that’s what auditors are coming in and flagging folks for, saying, ‘First and foremost, you’ve got this easy-to-find segregation of duties violation.’” This exposure is one reason why database activity monitoring is so critical to enterprises seeking to satisfy regulatory requirements. Unfortunately, all too many organizations fail to log, track, or monitor database activity because they worry that such monitoring may affect database performance. DBAs and other database stakeholders should know that today’s third-party monitoring tools aren’t nearly as burdensome to database performance as in years past, experts say.

5. Reporting On Compensating Controls In those instances where organizations have appropriate compensating controls in place, auditors want proof that these controls actually exist, Laliberte says.

6. Following Defense-In-Depth Strategies Finally, it is important to remember to keep a little perspective on the matter of database security and compliance.
“This is really just a piece of what has to be a pretty large security program that’s going to allow you to meet these regulations,” says Mike Rothman, senior vice president of strategy for eIQnetworks, a security information and event management company.

Phil Lieberman, president of Lieberman Software, a password management company, believes this is one of the biggest database risks of all. The data may be secure on the server, but if someone with ill intent gets hold of the unencrypted tape, then it will be compromised all the same.

http://www.darkreading.com/story/showArticle.jhtml?articleID=220600156


Five Ways To Meet Compliance In A Virtualized Environment

Posted on September 3, 2009 (updated December 30, 2021) by admini

“It’s a good idea to talk about the intersection between compliance and security…. A lot of compliance regulations are written assuming the systems are physical — and that only certain administrators have rights to physical systems,” says Jon Oltsik, senior analyst at Enterprise Strategy Group.

“What if financial information sits on a virtual system and on a system with other [applications running on it]? If a financial application runs as a VM on a physical system, where do the access controls need to be? How are the regulations going to change to accommodate that?”

And compliance doesn’t always equal security — just take a look at some of the biggest data breaches of late. Virtualization adds another dimension to that problem. “You can have compliance without security and security without compliance,” Oltsik says.

Configure the virtualization platform, both the hypervisor and administrative layer, with secure settings, eliminate unused components, and keep up-to-date on patches. Virtualization vendors have their own hardening guidelines, as does the Center for Internet Security and the Defense Information Systems Agency, according to RSA and VMware.

“Virtualization infrastructure also includes virtual networks with virtual switches connecting the virtual machines. All of these components, which in previous systems used to be physical devices, are now implemented via software,” state the RSA and VMware best practices guidelines. Extend your current change and configuration management processes and tools to the virtual environment as well.

Server administrators should have control over virtual servers and network administrators, over virtual networks, and these admins need to be trained in virtualization software in order to avoid misconfiguration of systems. “Careful separation of duties and management of privileges is an important part of mitigating the risk of administrators gaining unauthorized access either maliciously or inadvertently.”

Deploy virtual switches and virtual firewalls to segment virtual networks, and use your physical network controls in the virtual networks as well as change management systems.

Monitor virtual infrastructure logs and correlate those logs across the physical infrastructure, as well, to get a full picture of vulnerabilities and risks.

http://www.darkreading.com/security/management/showArticle.jhtml;jsessionid=HQVORXCLBU4A3QE1GHRSKHWATMY32JVN?articleID=219501096


Expert Names Top 10 Audit Issues of 2009

Posted on May 7, 2009 (updated December 30, 2021) by admini

During this economic downturn, many companies will face disgruntled employees and will need to be able to control their access.

“Specific attention items should be: timely removal of access, periphery security, internal security architecture, physical security and badge location, help desk procedures, workstation security and IDS management,” Juergens said.

Many help desks and incident response teams will be understaffed, and Juergens advised that now is a good time to re-examine security procedures.

Enterprise search tools are more powerful than before, but auditors must “review data classification schema, access management, index design and maintenance, and user training,” said Juergens.

IT organizations must have contingency plans in place in case a partner fails and must be able to monitor the status of the entire supply chain, including that part of it that is outside the company.

For those organizations pursuing green IT initiatives, auditors must monitor their effectiveness and their compliance with local and federal law.

http://www.internetnews.com/bus-news/article.php/3819156/Expert+Names+Top+10+Audit+Issues+of+2009.htm


NIST suggests areas for further security metrics research

Posted on March 9, 2009 (updated December 30, 2021) by admini

“Security metrics is an area of computer security that has been receiving a good deal of attention lately,” the agency said in the draft of the new interagency report, titled “Directions in Security Metrics Research…. Advancing the state of scientifically sound security measures and metrics would greatly aid the design, implementation, and operation of secure information systems,” the report states.

Formal Models of Security Measurement and Metrics: “The absence of formal security models and other formalisms needed to improve the relevance of security metrics to deployed systems have hampered progress.”
Historical Data Collection and Analysis: “Predictive estimates of the security of software components and applications under consideration should be able to be drawn from historical data collected about the characteristics of other similar types of software and the vulnerabilities they experienced.

At the very least, insight into security measurements would likely be gained by applying analytical techniques to such historical collections to identify trends and correlations, to discover unexpected relationships and to reveal other predictive interactions that may exist.”

Practicable Concrete Measurement Methods: “The current practice of security assessment, best illustrated by lower level evaluations under the Common Criteria, emphasizes the soundness of the evaluation evidence of the design and the process used in developing a product over the soundness of the product implementation.”

Under the Federal Information Security Management Act, the CSD is responsible for providing agencies with standards, specifications and guidance in implementing requirements of the act.

Toward that end, NIST issued 18 special publications offering management, operational and technical security guidance, and has updated several Federal Information Processing Standard publications covering hash algorithms and digital signatures.

http://gcn.com/articles/2009/03/09/nist-security-metrics.aspx

