NEW YORK — Enterprise buying decisions on wireless technologies are still hampered by security concerns and a lack of standardization, panelists at a CIO/CTO panel on unlocking enterprise mobility said at the CeBIT America show here Thursday afternoon.
“We believe that there is tremendous opportunity, there just hasn’t been enough investment in the standardization of the infrastructure,” said Martin P. Colburn, CTO and executive vice president of private-sector financial regulatory services provider NASD.
Colburn noted that availability remains a large concern, restricting NASD’s use of wireless technology to applications that are not time sensitive: those that can run in batch mode rather than requiring real-time transactions. Security considerations also get in the way.
“We would not connect the markets to these devices,” he said.
But the upside of going mobile remains high, particularly in enterprises that rely on large numbers of mobile workers. Trucking and transport logistics company Schneider National, which operates a fleet of 15,000 tractors and 45,000 trailers, made that realization early on, when it turned to a little-known company in the late 1980s to outfit its tractor fleet with two-way data connections through a geosynchronous satellite.
“Wireless is a critical technology for us,” said Paul Mueller, vice president, Tech Services, of Schneider National. “We were Qualcomm’s first customer.”
By putting wireless technology in its tractors, the firm was able to streamline communications with its drivers, improve dispatch processes, and track its fleet. Mueller said that when the company made the decision to equip its tractors with the technology, there was little historical information available with which to quantify the potential return on investment (ROI). However, it was clear that the old way of doing things was inefficient at best, because drivers would line up at payphones in truck stops to communicate with headquarters. The technology allowed the company to implement an automated dispatch cycle, letting it send new dispatches and directions to drivers instantly and get real-time updates on when shipments are picked up and when they are delivered.
But now the firm is looking to add tracking technology to its fleet of trailers, and that is proving to be more of a headache. The ROI case is much clearer this time, as the company can draw on its 15 years of experience with the technology. But over the past few years it has settled on several different technologies, only to have the supplier pull the product from the market or go belly-up.
“You’re going to be in a one-off product until you get some standardization,” Colburn said. He added, “There’s a tremendous amount of heterogeneity out there.” Without some sort of standardization, he said, organizations will be locked into a particular vendor and at the mercy of any difficulties it may experience.
Mueller also noted that coverage remains an issue. “We care about the North American road network and rail network,” he said. “There is a significant lack of coverage.”
From this experience, Mueller said he has drawn the lesson that it is essential to abstract the technology from the business process when selecting a new technology, allowing the technology to come and go while the process remains fundamentally the same.
“We’re going to see the technology iterate and evolve over time,” he said. “We have to abstract the technology from the business process. Fundamentally, what we’re after is the data.”
Additionally, Chris McMahan, CIO of Wireless Retail, a firm that retails cell phones and satellite technologies, said the technology has to be made easier for the mass market to use.
“The connectivity of a PDA to the Internet is still not seamless,” he said. “For mass adoption, it has to be seamless.”
Wireless Retail uses the technology to push a snapshot of each day’s business (gathered through point-of-sale systems) to company executives and managers as a text message.
Meanwhile, firms implementing wireless technology to give mobile employees access to the back-end have to carefully balance ease-of-use for employees with security, McMahan said.
McMahan said VPN is probably the best way at the moment to secure wireless transactions, but the problem is that firms must balance making it easy for non-technical employees to connect via VPN against making it too easy for outsiders to access back-end systems.
Ankr Successfully Patched The Security Vulnerability Exploited By Hackers
Ankr protocol has swiftly patched the security vulnerability exploited by hackers earlier this week
Ankr, a decentralized finance (DeFi) protocol also popularly known as the first ‘node-as-a-service’ platform, has announced that it has restored security after suffering a hack on Dec 1st. The attackers stole an estimated $5 million worth of BNB across liquidity pools in various DEXes.
In the announcement, Ankr states that it has taken the necessary steps to compensate liquidity providers that were affected by the hack. It will purchase $5 million worth of BNB to use in paying out the compensation.
“Thanks to the fast actions from the Ankr team and various protocols, we were able to minimize any damage done extremely quickly. Hacks and exploits from bad actors like this are an unfortunate possibility in Web3, even with every attention to detail in security processes; but we were well prepared. Unlike previous events in space this year, we are doing the right thing by our community and ensuring that this is taken care of immediately with lost funds restored”, says Chandler Song, Co-Founder & CEO, Ankr.
What happened?
Ankr was hacked in the early hours of Friday, with the attacker leveraging the smart contract for the aBNBc token, which allowed them to create an infinite amount of the token. The token represents a staked version of Binance’s BNB token that earns rewards on Ankr.
The aBNBb smart contract was safe from third-party minting prior to the attack; however, the attacker was able to obtain access to the deployer key. The attacker then uploaded a new aBNBb contract that included an extra method to mint without authorization checks, minted an excess of aBNBb out of thin air, and rapidly moved to swap it for other tokens on decentralized exchanges.
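The actual contract is written in Solidity and its source is not reproduced here; the TypeScript sketch below is only a simplified model of the vulnerability class — a privileged mint function whose caller check has been removed — with all names hypothetical.

```typescript
// Simplified, hypothetical model of the bug class: a privileged mint that no
// longer verifies the caller. Illustrative TypeScript, not the actual
// aBNBb/aBNBc Solidity contract.
class StakedToken {
  private balances = new Map<string, bigint>();

  constructor(private readonly authorizedMinter: string) {}

  // Safe version: only the designated minter account may create new tokens.
  mint(caller: string, to: string, amount: bigint): void {
    if (caller !== this.authorizedMinter) {
      throw new Error("unauthorized mint");
    }
    this.balances.set(to, (this.balances.get(to) ?? 0n) + amount);
  }

  // The malicious upgrade effectively added a path like this one: the same
  // state change, but with no authorization check at all.
  mintUnchecked(to: string, amount: bigint): void {
    this.balances.set(to, (this.balances.get(to) ?? 0n) + amount);
  }
}
```

With a path like the unchecked one deployed on-chain, anyone willing to pay the transaction fee can inflate the supply at will, which is what allowed tokens to be minted out of thin air.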
The address 0xf3a used the infinite mint bug in Ankr’s contract code to mint a total of 60 trillion aBNBc across 6 different transactions. The attacker was able to swap some for the stablecoin USDC and began moving them off of the Binance Smart Chain and onto Ethereum before the transactions were flagged. The Ankr team confirmed that it had been robbed of roughly $5 million in BNB. It also announced a proposal to make affected users whole by reissuing a new token called ankrBNB which would be distributed to pre-hack aBNBc holders.
What are the next steps for Ankr and its users?
The Ankr team says it quickly identified the vulnerability and began flagging the attackers’ attempts to liquidate the assets via various exchanges, an action it says helped limit damages to $5 million. No other liquid staking tokens or Ankr products were affected, and Ankr’s validators, RPC API, and AppChain services continued to operate without any disruptions during the incident.
Although the immediate security issue has been fixed, Ankr is taking further steps to ensure more robust and complete security. The protocol is discontinuing the current smart contracts of the aBNBc token and its sister token aBNBb.
New tokens will be minted and airdropped to all users of the current tokens. The announcement states that Ankr will use a snapshot to airdrop the newly released tokens to all valid aBNBc holders.
Tech CIOs Getting Deeper Into Product Development
In the last few years, the scope of the CIO role has widened, extending beyond internal IT to the external environment and the business itself. Let’s see how CIOs are working beyond their traditional scope and where they are going deeper.
Like any company, an IT company’s main objective is to offer a product to its clients, and IT companies mostly make digital products, often packaged as services or software. Even in the case of physical products, companies and customers are both going digital for purchase and sale. While the CIO’s role was once to manage digital infrastructure, systems, and processes, CIOs are now also responsible for e-commerce, online services, delivery of digital products, digital marketing, and digital products for employees.
CIO, or Chief Information Officer, is one of the key positions in an IT company. A CIO is responsible for managing the internal systems, technology, work environment, and infrastructure of the IT company, ensuring smooth and efficient operations. The role is central to fulfilling business goals and is considered one of the highest positions in an IT company.
Tech CIOs & Product Development
The connection between CIOs and product development deepened after the outbreak of the pandemic, which spread remote work culture and brought digital transformation to every segment. CIOs now have to work closely on product development to manage the company’s remote operations, ensure high work efficiency and employee productivity, and take care of online sales and distribution.
Why Do Tech CIOs Need to Get Deeper into Product Development?
Though the pandemic is almost over, remote work culture is here to stay, and online sales and delivery will not go away anytime soon. Tech CIOs are going deeper into product development to ensure smooth, fast, and agile operations. This improves the overall working environment and efficiency, leading to more innovative, higher-quality products for customers. Modern tools and products for internal infrastructure also improve employees’ productivity and create a feeling of satisfaction among them. Finally, it gives senior leadership a better grip on data, statistics, the market, and internal work culture, accelerating the business’s progress.
How Do CIOs Affect Product Development?
CIO is a very senior position, and the person appointed to it has years of industry experience. They have typically been through the product development cycle, or been part of a product development team, earlier in their careers, so they bring a skill set that transfers readily to product development. And because they excel at managing processes, they can be highly effective in service-oriented products such as customer service, recruitment, automation, and cloud or data centers. All of these segments are part of IT infrastructure, and few can match a CIO when it comes to IT infrastructure management. Let’s look in detail at the CIO’s role in product development.
Customer Service
Whether over the phone or on-site, customer service has gone almost completely digital, and CIOs can revolutionize this segment. Because tech CIOs are highly involved in employee satisfaction and query management, they can contribute exceptionally to developing new mechanisms and products for the best customer service. Customer service is also part of the internal IT infrastructure, so tech CIOs are in direct touch with the whole process, including customer feedback and other consumer data.
Recruitment
Though tech CIOs traditionally had no direct connection with the recruitment process, they can make it faster, smoother, and more effective through product development. Plenty of HR automation software is on the market, much of it developed with extensive input from tech CIOs. Technical training software and training infrastructure also fall under tech CIOs, and a recruitment process is incomplete without proper training infrastructure and tools. It also helps with retention.
Automation
These are just a few examples; tech CIOs’ scope is continuously widening.
How Do Tech CIOs Directly Impact Company Revenue?
There is no doubt that tech CIOs play an important role in fulfilling a company’s financial goals and other objectives, but traditionally they have not had a direct impact on revenue. Through product development, however, they can directly affect an IT company’s revenue and profits: by offering the internal infrastructure and business IT solutions they build to other, smaller companies.
Conclusion
Not only tech CIOs: anyone working in an IT company with a strong technical skill set cannot be confined to a fixed scope, and for high positions like the CIO, that scope will keep widening. This benefits the organization in terms of profits, revenue, visibility, and sales, and it benefits employees and the CIO as well. So it is entirely reasonable that tech CIOs are going deeper into product development, especially now that remote and hybrid work have spread so widely. It is the need of the hour, and involving more experienced technical leaders in product development is good practice, ensuring strong growth and a healthy working environment.
What You Need To Know About Future Web Standards
While it started as an experiment forty years ago, the Internet has become a very important part of our lives. Think about how much influence it has on areas like education, business, commerce, science and technology. To cope with traffic demand and other aspects like speed and security, many new Web standards and protocols have been added and upgraded over time. In fact, many such protocols and technologies are being framed and deployed as we speak. Last week, the inventor of the Web, Tim Berners-Lee, spoke about the changes the Web has seen in recent times. Let’s take a look at what new things we are seeing on the Internet right now and the new web standards we’ll be presented with in the near future.
HTTP/2 and SPDY
While the majority of browsers support SPDY in their latest versions, that alone is not enough for the feature to work: to load a page faster, the website has to support the same technology on its end. Popular sites such as Google, Facebook and Twitter have already enabled this capability, but the vast majority of sites have yet to make the switch. Later this year we’ll see the implementation and deployment of HTTP/2.
WebRTC
The web browser is getting smarter every day. Not only is it becoming more secure and stable, but behind the curtains it is quietly implementing homegrown tools to replace proprietary tools that previously had to be installed separately. One such feature is Web Real Time Communication (WebRTC), which lets users hold video conversations without a VoIP service such as Skype; everything required is built into the browser. Chrome and Firefox already support WebRTC, and you can head over to the WebRTC Demo page to try out the feature.
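As a rough illustration of how little is needed, the TypeScript sketch below captures the local camera and microphone with the standard browser APIs. The element id is hypothetical, and a real call would additionally exchange session descriptions and ICE candidates over a signaling channel.

```typescript
// A minimal sketch: preview the local camera/mic and hand the tracks to an
// RTCPeerConnection. Signaling (offer/answer exchange) is omitted.
async function startPreview(): Promise<void> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  const video = document.getElementById("preview") as HTMLVideoElement; // hypothetical element
  video.srcObject = stream;
  await video.play();

  const pc = new RTCPeerConnection();
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
}

startPreview().catch(console.error);
```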
SRCSET
People use thousands of different devices to access the Web. One person could use an iPad Mini, while another may fancy a Nokia Asha to get to the labyrinth of the Interweb. Some of these devices sport high-resolution screens, whereas many don’t. The challenge is to serve the appropriate image resolution to each user. So how do we do that?
The answer is Source Set (SRCSET). It is an extension of the HTML5 standard that allows Web designers to set up various versions of the same image file, so that the website serves the right image resolution for the kind of device you are using. Although it has yet to go mainstream, as of now this is one of the most prominent ways to overcome the issue.
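As a rough sketch of the idea (file names, widths, and breakpoints below are hypothetical), the same image is listed at several sizes and the browser picks the best candidate for the current device; the same attributes can of course be written directly in the HTML markup.

```typescript
// A minimal sketch: declare several widths of one image via srcset/sizes and
// let the browser choose. All file names and sizes are hypothetical.
const img = document.createElement("img");
img.src = "photo-800.jpg"; // fallback for browsers without srcset support
img.srcset = "photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w";
img.sizes = "(max-width: 600px) 100vw, 50vw";
img.alt = "Responsive example image";
document.body.appendChild(img);
```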
Responsive Web Design
Much like SRCSET, Responsive Web Design is something that many Web designers have started deploying on their websites. Consumers may use any number of devices to access a website, so it is important that pages pan out well regardless of the screen size they are viewed on. Ethan Marcotte described the idea quite succinctly. Today, many websites, including Make Tech Easier, have deployed techniques like fluid grids, flexible images, and media queries to make the site adjust to any size of screen.
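Media queries normally live in CSS, but the same query syntax is also available to scripts through window.matchMedia; here is a minimal sketch with a hypothetical breakpoint and class name.

```typescript
// A minimal sketch: react to a (hypothetical) 600px breakpoint from script,
// mirroring what a CSS media query would do.
const narrow = window.matchMedia("(max-width: 600px)");

function applyLayout(matches: boolean): void {
  document.body.classList.toggle("single-column", matches);
}

applyLayout(narrow.matches);
narrow.addEventListener("change", (event) => applyLayout(event.matches));
```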
HTML5 and CSS3
HTML5 already has a fair amount of traction. The markup language is responsible for the structure and appearance of a web page. The new version allows publishers to embed video and audio content on a web page without requiring any third-party tools like Silverlight and Flash. In addition, it can carry location-based information and supports offline access for web apps. This feature has already been approved but is awaiting the W3C’s recommendation.
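For instance, a video element can be created and played with nothing but the browser’s built-in capabilities; the file name below is hypothetical.

```typescript
// A minimal sketch: native video playback, no Flash or Silverlight plugin.
const video = document.createElement("video");
video.src = "intro.webm"; // hypothetical file
video.controls = true;
video.preload = "metadata";
document.body.appendChild(video);
```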
After over a decade, the third version of CSS finally rolled out. The biggest difference between CSS3 and its previous versions is the separation into modules. In previous versions, everything had to be written in the same document, whereas CSS3 introduces separate modules, each with a specific capability.
IPv6
When the Internet was being framed, its creators gave it 4.3 billion addresses: essentially that many termination points through which devices could connect to the Web. But as more mobile devices and computers popped up, the 4.3 billion addresses that once seemed inexhaustible proved insufficient. The new version, IPv6, which has already been adopted by popular websites such as Google and Facebook, offers 340 “trillion trillion trillion” addresses. It’s safe to assume that even if every planet in our solar system got its Internet connection from Earth, we would still have plenty left over.
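The arithmetic behind those figures is straightforward: IPv4 uses 32-bit addresses and IPv6 uses 128-bit addresses, as the small sketch below works out.

```typescript
// IPv4: 2^32 addresses (~4.3 billion). IPv6: 2^128 addresses
// (~3.4 x 10^38, i.e. roughly 340 trillion trillion trillion).
const ipv4Addresses = 2n ** 32n;
const ipv6Addresses = 2n ** 128n;

console.log(ipv4Addresses.toString()); // 4294967296
console.log(ipv6Addresses.toString()); // 340282366920938463463374607431768211456
```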
Native Clients
As our computational needs move toward the cloud, our web browsers are being laced with more power. Thanks to Google and Microsoft, we have several native and portable web apps that can run in the browser itself; Google Drive and Office Online are two great examples. Until a few months ago, these native apps couldn’t run on Android and other mobile devices, but recent changes from Google add support for devices without Intel processors.
Where are we headed?
Many of the aforementioned web standards haven’t gone mainstream yet. Adoption is a continuous process and takes a fair amount of time. Every day new things are added, and older, less optimized code is weeded out. Many research organizations are working on building new protocols and enhancing existing ones. The Web as we know it is changing, and to keep up, our web browsers are picking up new technologies as well. One very reassuring thing to come out of all this is that the Internet is getting better.
OpenVMS: An Old Dog Still Doing New Tricks
Thought by many to be long since dead and buried, the OpenVMS operating system persists inside many enterprises.
OpenVMS continues to host critical applications, and in some areas such as disaster recovery, it is even enjoying a renaissance.
Why?
Despite an avalanche of hype about unsurpassed availability, fault-tolerance and security capabilities in UNIX, Linux and even Windows Server 2003, the OpenVMS operating system is leaving them in the dust in test after test. On top of that, real world examples abound of this unfashionable operating system standing up to the most rigorous disaster scenarios.
One online brokerage, for example, had a full-blown outage right before the start of the trading day. A brand-new security guard heard an alarm emanating from a UPS device and panicked. He hit the emergency power-off button, which took down the whole site. Fortunately, the brokerage had a disaster-tolerant OpenVMS cluster and a second data center 130 miles away with a full complement of servers and complete backup of stored data.
”The company operations continued without a glitch,” says Keith Parris, a disaster recovery specialist at Hewlett-Packard Co. ”They ran through stock market trading that entire day on a single site; powered the first site back up after trading hours were over, and started the data re-synchronization operations required to restore the protection of cross-site data redundancy once again.”
A steady diet of similar stories is convincing Fortune 500 companies to either look again at OpenVMS or postpone their plans to phase out this ”legacy” system.
Surprisingly, the stats of this old OS are impressive.
According to Ken Farmer, the operating system boasts 10 million users worldwide and hundreds of thousands of installations. It also shows annual growth rates of 18 percent over the last few years and cluster uptimes surpassing the five-year mark. In terms of performance, OpenVMS claims 3,000 simultaneous active users; almost 2 million database transactions per minute (with Oracle); up to 96 cluster nodes (over 3,000 processors); and full cluster capability at distances up to 800 kilometers.
”OpenVMS has moved almost seamlessly from VAX to AlphaServer system and now to HP Integrity Servers,” says Farmer. ”It is bulletproof, genuinely 24/7, disaster tolerant, remarkably scalable, rock solidly stable and virtually unhackable.”
The unhackable claim was validated at the DefCon 9 Hacker Conference where OpenVMS did so well they never invited it back. It beat out NT, XP, Solaris and Linux, and then was graded as unhackable by the best hackers in the business.
Surprisingly, this new-found fame is being championed by relatively few vendors. On the hardware side, Parris says HP offers business continuity products and services that begin with assessing an enterprise’s needs and objectives, and run all the way to full-service data centers and partnerships with niche companies to serve target markets.
International Securities Exchange (ISE) is an HP OpenVMS customer that only adopted it a couple of years ago. It uses HP AlphaServer systems running in an OpenVMS multi-site cluster environment at its New York City headquarters, along with an HP StorageWorks SAN.
”OpenVMS is a proven product that’s been battle tested in the field,” says Danny Friel, CIO at ISE. ”That’s why we were extremely confident in building the technology architecture of the ISE on OpenVMS AlphaServer systems.”
ISE boasts the fastest trading speeds in the industry — less than 0.2 seconds in the New York area. It also has the ability to recover quickly from any failure as it has no single point of failure.
On the software side, a few companies are doing very well servicing OpenVMS clients. Executive Software continues to offer several OpenVMS utilities, such as Diskeeper for OpenVMS, I/O Express, Frag Guard and Filemaster to improve OS performance.
”Some of our Windows customers think we recently brought out an OpenVMS version of Diskeeper, but in actual fact, we built the company on Diskeeper for OpenVMS about two decades ago,” says Justin Robertson, OpenVMS sales manager at Executive Software. ”We are seeing steady sales of new licenses of our OpenVMS products.”
The reason so many big companies are adopting or sticking firmly with OpenVMS comes down to the cost of downtime: the bigger you are, the more money you make, the more critical a few minutes of downtime become, and the easier it is to justify a high-end system like OpenVMS.
After all, the perils of a data center crash are horrible indeed. According to the U.S. National Archives and Records Administration, 93 percent of companies that lost their data centers for at least 10 days filed for bankruptcy within a year. Half didn’t even wait that long and filed immediately.
”OpenVMS is probably the best designed and most robust general purpose operating system in existence,” says Colin Butcher, an analyst with consulting group XDelta Ltd. ”There are quite a few complete systems out there with uninterrupted service uptimes in excess of 15 years.”
A Complete Guide To Cloud Security Testing: Importance, Cloud Security…
Cloud security testing is the process of assessing and mitigating the security risks associated with cloud computing. It is profoundly important because cloud computing introduces new risks that need to be assessed and managed accordingly. In this article, we will look at different cloud security testing techniques, the benefits of cloud security testing, different approaches to cloud security testing, and the most common cloud security threats. We will also discuss cloud security testing best practices.
What is Cloud Security Testing?
Cloud security testing is the process through which the security risks associated with cloud computing are assessed and mitigated. It helps organizations protect their data, applications, and infrastructure from unauthorized access, use, disclosure, modification, or destruction.
Why is Cloud Security Testing important?
Cloud computing introduces new risks that need to be assessed and managed. In a traditional on-premise environment, an organization has full control over its data center infrastructure and can implement security controls to mitigate risks. In a cloud environment, however, the organization does not have direct control over the physical infrastructure or the platform on which its applications are running. Testing cloud security allows businesses to find and address any potential security vulnerabilities in their cloud systems.
Cloud Security Testing Techniques
There are a number of cloud security testing techniques that can be used to assess the security of cloud applications and infrastructure. We can classify these techniques into the following categories:
Reconnaissance
This is the initial stage of cloud security testing, during which essential information about the target cloud environment is gathered and investigated. The networks in scope are examined and live hosts are identified, typically using tools such as Netcat and ping.
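A trivial example of this kind of probe, sketched in TypeScript for Node.js with a hypothetical host and port, checks whether anything answers on a given TCP port, much as one would with ping or netcat.

```typescript
import * as net from "node:net";

// A minimal sketch: is this host answering on this TCP port? Host and port
// are hypothetical; only probe systems you are authorized to test.
function isReachable(host: string, port: number, timeoutMs = 2000): Promise<boolean> {
  return new Promise((resolve) => {
    const socket = net.createConnection({ host, port });
    socket.setTimeout(timeoutMs);
    socket.once("connect", () => { socket.destroy(); resolve(true); });
    socket.once("timeout", () => { socket.destroy(); resolve(false); });
    socket.once("error", () => resolve(false));
  });
}

isReachable("203.0.113.10", 443).then((open) => {
  console.log(open ? "host answers on 443" : "no answer on 443");
});
```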
Vulnerability Assessment
This involves scanning for vulnerabilities in the Cloud application or infrastructure and reporting on them. It can be conducted manually or using automated tools.
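One very small example of an automated check, assuming a hypothetical endpoint, is to flag responses that are missing common security headers.

```typescript
// A minimal sketch: report well-known security headers missing from a
// response. The URL is hypothetical; fetch requires Node 18+ or a browser.
async function checkSecurityHeaders(url: string): Promise<void> {
  const response = await fetch(url, { method: "HEAD" });
  const expected = [
    "strict-transport-security",
    "content-security-policy",
    "x-content-type-options",
  ];
  for (const name of expected) {
    if (!response.headers.has(name)) {
      console.log(`Missing security header: ${name}`);
    }
  }
}

checkSecurityHeaders("https://app.example.com").catch(console.error);
```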
Penetration Testing
This entails attempting to exploit security flaws in a cloud application or system in order to access private information or systems. These tests can be done either manually or using automated tools.
Reporting
Finally, the findings from the assessments and penetration tests are documented, along with their severity and recommended remediation steps.
Cloud Security Testing Benefits
Cloud security testing is a service that enterprises may use to evaluate the security of their cloud apps and infrastructure and to ensure that they are in compliance with security standards. Cloud security testing also helps organizations to improve their overall security posture by identifying weaknesses in their systems and implementing controls to mitigate these risks.
Different Approaches to Perform Cloud Security Testing
There are three different approaches that can be used to perform cloud security testing:
Black-Box Testing: This approach involves testing the Cloud application or infrastructure without having any prior knowledge of its internal structure. Black-box testing is typically used to assess the security of external-facing applications and services.
White-Box Testing: This approach involves testing the Cloud application or infrastructure with full knowledge of its internal structure. White-box testing is typically used to assess the security of internally facing applications and services.
Gray-Box Testing: This approach involves testing the Cloud application or infrastructure with partial knowledge of its internal structure. Gray-box testing is typically used to assess the security of applications and services that are not fully accessible.
Most Common Cloud Security Threats
The most common cloud security threats include:
Data breaches: This occurs when unauthorized users gain access to sensitive data stored in the Cloud. Data breaches can occur due to a variety of reasons, including weak passwords, unsecured data transmission, and poor security controls.
Denial of service attacks: This occurs when an attacker attempts to make an application or service unavailable by flooding it with requests. Denial of service attacks can cause significant damage to an organization, resulting in loss of productivity and revenue.
Malware: Software capable of damaging or disabling computers, malware can be used to steal sensitive data, destroy information, and disrupt operations.
Cloud Security Testing Best Practices
There are a number of best practices that should be followed when conducting cloud security testing:
Identify your cloud usage state and the associated risks: The first step is to identify how your organization is using the cloud and what risks are associated with that usage. Knowing how much time and money you need to invest in testing will help you make an informed decision.
Develop a comprehensive testing plan: A comprehensive testing plan should be developed that takes into account the specific needs of your organization. The plan should include all aspects of cloud security testing, from assessments to penetration testing.
Implement security controls: When vulnerabilities have been discovered, security measures should be put in place to minimize the dangers. These controls can include technical measures, such as firewalls and intrusion detection systems, or organizational measures, such as user training and policy development.
Monitor Cloud activity: Cloud activity should be monitored on an ongoing basis to ensure that all security controls are effective. CloudTrail and CloudWatch can both be used to provide logging.
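As one possible illustration, assuming an AWS environment and the AWS SDK for JavaScript v3, a short script can pull recent console sign-in events from CloudTrail as a basic monitoring check; the region, filter, and result count below are only example values.

```typescript
import { CloudTrailClient, LookupEventsCommand } from "@aws-sdk/client-cloudtrail";

// A minimal sketch: list recent console sign-in events recorded by CloudTrail.
// Region, event filter, and result count are illustrative values.
async function recentConsoleLogins(): Promise<void> {
  const client = new CloudTrailClient({ region: "us-east-1" });
  const result = await client.send(new LookupEventsCommand({
    LookupAttributes: [{ AttributeKey: "EventName", AttributeValue: "ConsoleLogin" }],
    MaxResults: 20,
  }));
  for (const event of result.Events ?? []) {
    console.log(event.EventTime, event.Username, event.EventName);
  }
}

recentConsoleLogins().catch(console.error);
```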
Conclusion
Cloud security testing is a critical process that should be conducted regularly to ensure the security of Cloud applications and infrastructure. By following the best practices listed above, organizations can reduce their risk of exposure to common Cloud security threats.