A few weeks ago, security researchers at Keen Security Lab, the security research arm of China-based Tencent, used a vulnerability in WebKit, a widely used open-source browser framework, to trick a Tesla Model S that connected to a malicious Wi-Fi hotspot into downloading a malicious payload. They then used a second vulnerability, in the version of Linux that Tesla uses, to gain full privileges on the head unit. Finally, they overwrote the firmware in a “gateway” that separates the head unit from the CAN bus – which can be used to control key car operations – defeating the security mechanism that allows only a small set of whitelisted commands to pass from the head unit to the driving systems. Ultimately, this allowed them to remotely activate the vehicle’s brakes.
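The gateway’s job, in simplified form, is to drop any message whose ID is not on an approved list. A minimal sketch of that filtering logic – the IDs and message shapes here are made up for illustration and are not Tesla’s actual protocol:

```python
# Hypothetical CAN gateway filter: forward only whitelisted message IDs
# from the head unit to the drive bus. IDs below are illustrative.

WHITELISTED_IDS = {0x1A0, 0x2C4, 0x3E9}  # e.g. climate, media, diagnostics

def gateway_forward(can_id: int, payload: bytes) -> bool:
    """Return True if the message may pass from head unit to drive bus."""
    if can_id not in WHITELISTED_IDS:
        return False          # drop anything not explicitly allowed
    if len(payload) > 8:      # classic CAN frames carry at most 8 data bytes
        return False
    return True

# A braking command on a non-whitelisted ID is dropped...
print(gateway_forward(0x0F5, b"\x01"))   # → False
# ...while an allowed ID passes through.
print(gateway_forward(0x1A0, b"\x22"))   # → True
```

Overwriting the gateway firmware removes exactly this check, which is why the researchers targeted it.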
It wasn’t too long ago that the worst that could come of a vulnerability in a browser was an end user’s PC being compromised. Interestingly, Tesla’s fix for the problem was to ship “code signing” as part of a firmware update, so that further updates to components on the vehicle’s CAN bus require a cryptographic key that only Tesla holds.
This is the right approach to solving the problem, and one that Tesla CTO J.B. Straubel correctly says should become a standard in the auto industry. Interestingly, IBM Lotus Notes had code signing for updates in Release 1, shipped in 1989!
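The core idea behind code signing is that the device refuses to flash any image whose signature does not verify against a key it trusts. A minimal sketch of that accept/reject gate – using an HMAC with a shared secret for brevity; real code signing uses asymmetric signatures (e.g. RSA or Ed25519), so the device holds only a public key while the vendor keeps the private one:

```python
import hashlib
import hmac

# Simplified stand-in for code signing. Production systems use asymmetric
# signatures; an HMAC with a shared secret illustrates the same gate in a
# few lines. The key below is hypothetical and would never be stored like this.
SIGNING_KEY = b"vendor-held-secret"

def sign_firmware(image: bytes) -> bytes:
    """Vendor side: produce a signature over the firmware image."""
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def verify_and_flash(image: bytes, signature: bytes) -> bool:
    """Device side: flash only if the signature verifies."""
    expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, signature):
        return False  # reject tampered or unsigned firmware
    # ... write image to flash here ...
    return True

official = b"\x7fELF...official-build"
sig = sign_firmware(official)
print(verify_and_flash(official, sig))                   # → True
print(verify_and_flash(b"attacker-patched-build", sig))  # → False
```

With this gate in place, the WebKit and Linux bugs alone no longer suffice: the attacker would also need the signing key to push new gateway firmware.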
As cars get more connected and autonomous, the stakes for getting security right are high. Security for cars is catching up to security for other connected consumer devices, with decades-old attack vectors and decades-old security mechanisms being brought back into play. Interesting times!
In the auditing and compliance world, the buzzwords right now are IFRS and SSAE. IFRS stands for International Financial Reporting Standards and SSAE stands for Statement on Standards for Attestation Engagements. Both aim at standardizing the rules, which is necessary given the current nature of business. In my opinion, globalization and outsourcing are the two major catalysts for this standardization.
In the technology world, owning servers and infrastructure will soon become obsolete. A lot of companies have already started using SaaS (Software as a Service) and PaaS (Platform as a Service), along with SOA (Service-Oriented Architecture). The world of cloud computing is next in line. Now the question arises: what do these all mean for compliance? SaaS had a simple solution for compliance – SLAs (Service Level Agreements). But with cloud computing it’s a totally different story.
Let me start by giving a brief background on cloud computing –
According to Wikipedia, cloud computing refers to the use of Internet (“cloud”) based computer technology for a variety of services. In layman’s terms, it means you don’t have your own processing or storage (or both) and use a provider like Amazon or Google for it. The main categories where an enterprise can use cloud computing include
- Storage and
- Raw computing.
Cloud computing has its own benefits and risks. Benefits include lower capital expenses, innovation, scalability and performance. Compliance is the biggest risk, because your information may be processed or stored on servers anywhere in the world, and the system is so dynamic that the data does not sit at any single location. Still, the benefits of this technology are tenfold, and I am sure we will come up with a solution. In short, cloud compliance is going to be the next big thing.
I was reading an article on ZDNet, “How IT can save us from recession”, and found it hilarious.
Ken McGee, VP and Gartner Fellow, said that IT innovation will be a key factor in helping the US get out of the recession, and that IT departments should implement an action plan he calls “Innovation of the Third Kind”.
The first kind of innovation is when IT people devise plans to meet technical needs that have been observed by IT staff, McGee said. “Infrastructure modernization is an example,” he said.
The second type of innovation is when IT introduces solutions to deal with business needs that have been identified by business people. These solutions only produce incremental change from an infrastructure perspective.
What is needed now is innovation of a third kind, said McGee. This comes about when IT people introduce projects to meet business needs that have been observed by IT staff, he said. The senior analyst said IT executives should start scheduling meetings with business unit managers and executives, review their IT portfolio and decide what is going to go ahead, what could be trimmed and which products are going to be cut. “Have business plan in place before getting the green light”, said McGee.
It is a very noble thought, and a true statement had it been limited to “IT innovation will be a key factor in helping the US get out of recession”, without the explanations or the taxonomy of innovations. I agree that the US needs innovation – it is what has kept the country at the top position in the world. We are in dire need of research and innovation right now, but it should not be limited to the IT industry.

Speaking of the IT industry, it would be a good idea to go back, evaluate projects and revise the priority of IT projects to make sure they are in line with business objectives. IT can definitely help business get through this difficult time, but it should do that anyway, be it a good time or a difficult one. IT and business are two sides of a coin; they should go hand in hand. What Ken said about IT meeting with business to find ways to improve should be an internal process of every company. CIOs have been part of top management since the 1990s – this is not a new discovery, but it is something every company needs to understand and give its due importance.

Secondly, think broadly of IT innovation: not innovation linked to any particular business, but innovation that can solve or improve a general business problem or practice. Virtualization, for example, solves a general business problem and maximizes the output from existing resources. It is said to reduce head count in monitoring and administering computer systems, but a recent Gartner survey also mentions that the people affected are moved to new roles; the focus then changes from maintenance to improvement and expansion.
Morpheus: What is real? How do you define real? If you’re talking about what you can feel, what you can smell, what you can taste and see, then real is simply electrical signals interpreted by your brain.
In the computational world and in this age, it is true that there is nothing called real. The world is dominated by the virtual – virtual reality, virtual dating, virtual test-drives, virtual machines, etc. I was first introduced to the concept of virtualization in my operating systems class (threads and instances). Virtualization gained popularity with object-oriented programming (encapsulation), where it was primarily used for designing applications. Recently it has gained momentum with resource and platform virtualization, and now the next big wave is using virtualization for business continuity. Underutilized servers, complex IT systems, increasing IT costs and loss of business due to IT disruptions are some of the burning issues in every company. Because virtualization separates applications from the physical layer and allows resources to be shared in a multi-OS environment, it is easier to set up two sites that are both active. This gives better utilization of physical resources, reduced costs and no interruptions, since there are two active sites. Virtualization is thus shifting the focus of business continuity from recovery to uninterrupted service.
As taught every day in my IT Strategy course, every technological investment should have a business case, and having understood its importance and impact after entering the industry, I am personally a big fan of business cases for any new implementation or technology. After the dot-com boom, virtualization is one technology that requires little effort from IT management to build a business case for, because it has quantitative results to show the value of the project. Consolidation through virtualization reduces administrator and operator costs, and there are predictions that the number of people required for monitoring and administering computer systems in the data center will decline by as much as 50% over the next two decades.
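The back-of-the-envelope math behind such a consolidation business case is straightforward. A sketch with entirely made-up figures (not from Gartner or any real deployment):

```python
# Hypothetical server-consolidation business case. All numbers below are
# illustrative assumptions, chosen only to show the shape of the calculation.
physical_servers = 40
avg_utilization = 0.12           # typical underutilized standalone server
target_utilization = 0.60        # allowed load per virtualized host
cost_per_server_per_year = 4000  # power, space, maintenance ($)

# Total compute demand, in "fully utilized server" units.
demand = physical_servers * avg_utilization            # 4.8 server-equivalents
hosts_needed = int(-(-demand // target_utilization))   # ceiling division

annual_savings = (physical_servers - hosts_needed) * cost_per_server_per_year
print(hosts_needed, annual_savings)   # → 8 128000
```

Even with conservative assumptions, the hardware count drops by a large factor, which is why the quantitative case is so easy to make.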
It is true in life, but truer in technology, that change is the only constant. Take information management. The focus of information management has been changing constantly since the early days. It all started with the twin problems of capturing complete information and processing it faster and more accurately. Then came the era when efficient information storage and retrieval was the issue; relational databases, normalization and increased storage capacity at the hardware level solved that problem. Bigger storage capacity and smaller physical size have now given birth to a new disorder that needs to be handled. Metaphorically, we now have big rooms and big binders with covers and tabs, and a good book-keeping method by which we can search the binders efficiently, but there is still a lack of context to all of this. To use another metaphor, systems and applications have taken the place of ministers in the court of the king – you, the user. Their job is to provide you with information precisely enough to support decision making. The only difference is that the ministers of old could process all the information and present to the king only those pieces relevant to the problem being discussed. Today’s systems and applications can no doubt provide data faster and with greater accuracy, but they lack context. Giving context to data is an important challenge faced by decision makers today.