5 Steps To Protect Your Business From Cyber Crime

A Seattle company was recently broken into, and a stash of old laptops was stolen. Just a typical everyday crime by typical everyday thieves. These laptops weren’t even being used by anyone in the company. The crime turned out to be anything but ordinary when those same thieves (cyber-criminals) used data recovered from the laptops to siphon money out of the company via fraudulent payroll transactions. On top of stealing money, they also managed to steal employee identities.

Another small company was hacked by another “company” that shared the same high-rise office building. Management only became aware of the theft once unusual financial transactions started appearing in their bank accounts. Even then, they didn’t know whether it was internal embezzlement or external cyber theft. It turned out to be cyber theft. The thief in this case drove a Mercedes, wore a Rolex watch . . . and looked like anyone else walking in and out of the building. Welcome to the age of cybercrime.

You Are Their Favorite Target
One of the biggest issues facing small businesses (SMBs) in the fight against cybercrime is the lack of a cyber-security plan: 83% have no formal plan, and over 69% lack even an informal one. Half of small business owners believe that cybercrime will never affect them. In fact, small businesses are a cybercriminal’s favorite target! Why? Small businesses are unprepared, and that makes a criminal’s job easier.

The result? Cyber-attacks cost SMBs an average of $188,242 per incident, and nearly two-thirds of the businesses affected are out of business within six months (2011 Symantec/NCSA study). A separate Verizon study found that over 80% of small-business cybercrime incidents stemmed from insufficient network security, with wireless and password issues ranking highest. With insecure networks and no formal plan, small businesses make it easy for the criminals.

How They Attack
The #1 money-generating technique these “bad guys” use is to infect your systems with malware, so that whenever you (or your employees) visit a website and enter a password (Facebook, bank, payroll, etc.), the malware harvests that data and sends it off to the bad guys to do their evil stuff.

They can get to you through physical office break-ins, “wardriving” (compromising defenseless wireless networks), e-mail phishing scams, and harmful websites. Cyber-criminals are relentless in their efforts, and no one is immune to their tricks.

5 Steps To Protect Your Business

  1. Get Educated. Find out the risks and educate your staff.
  2. Do A Threat Assessment. Examine your firewall, anti-virus protection, and anything connected to your network. What data is sensitive or subject to data-breach laws? (A small port-check sketch follows this list.)
  3. Create A Cyber-Security Action Plan. Your plan should include both education and a “fire drill.”
  4. Monitor Consistently. Security is never a one-time activity. Monitoring 24/7 is critical.
  5. Re-Assess Regularly. New threats emerge constantly. You can only win by staying ahead!
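
For step 2, a little scripting can jump-start the inventory work. Below is a minimal Python sketch that checks which commonly abused service ports answer on hosts you own; the addresses and port list are hypothetical placeholders, and you should only probe equipment you are authorized to test.

    # Check which commonly exposed service ports answer on your own hosts.
    # HOSTS and PORTS are hypothetical placeholders; edit them for your network.
    import socket

    HOSTS = ["192.168.1.10", "192.168.1.20"]        # assumed internal addresses
    PORTS = {21: "FTP", 23: "Telnet", 3389: "RDP"}  # services often left exposed

    def open_ports(host, ports, timeout=1.0):
        """Return the subset of ports that accept a TCP connection."""
        found = []
        for port in ports:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
                sock.settimeout(timeout)
                if sock.connect_ex((host, port)) == 0:  # 0 means it connected
                    found.append(port)
        return found

    for host in HOSTS:
        for port in open_ports(host, PORTS):
            print(f"{host}: {PORTS[port]} (port {port}) is reachable -- review it")

A quick pass like this won’t replace a professional assessment, but it surfaces obvious exposures to discuss with your IT provider.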

Bimodal Information Technology

The Information Technology (IT) environment at most organizations is one of constant innovation that brings in an influx of new information and data. Managing this, while maintaining existing processes, requires a sound business strategy. Today, the Office of Information Technology (OIT) at the University of Texas at San Antonio comprises approximately 150 full-time employees. This staff consists of typical enterprise teams such as developers, server and network administrators, and other positions vital to the daily operations of an IT department. In the past, however, OIT had the culture of a traditional IT shop, with teams struggling to keep up with day-to-day maintenance and operations.

Traditionally, maintaining existing services, upgrading existing applications, and implementing new business applications, along with other tasks, consumed the majority of our work, leaving employees no opportunity for creative development. Despite these many responsibilities, management still expected them to innovate and craft breakthrough ideas for technology. We realized this workload and structure was inefficient and perhaps even hindered innovation. We needed a team dedicated to keeping UTSA at the forefront of technology while providing the university with a competitive, strategic advantage. Our research led us to the concept of Bimodal IT, which we decided to implement at UTSA.

Gartner defines Bimodal IT as “having two modes of IT, each designed to develop and deliver information and technology-intensive services in its own way.” Mode One is traditional, emphasizing safety and accuracy. Mode Two is non-sequential, emphasizing agility and speed. Simply put, Mode One means slow and cautious implementation, similar to that of marathon runners, while Mode Two is fast, similar to that of sprinters.

We needed a Mode Two team that was innovative and not impeded by daily maintenance and operational tasks. The original idea for the Mode Two team involved moving quickly toward implementing innovative solutions while keeping in mind that everything the team did may not prove successful. We adopted a “fail fast” strategy that freed us from spending months on a project only to discover that it was not going to work for one reason or another.

An essential Mode Two practice is the “proof of concept,” which does not involve building projects to scale for the entire environment. Mode One team members implement and build the project to scale only after the Mode Two proof of concept has proven successful. To move toward a successful implementation phase, it is crucial for the Mode Two team to provide thorough documentation to the Mode One group.

Forming the Two Modes
Ken Pierce, UTSA’s former Chief Information Officer (CIO) for IT and Vice Provost, fully supported the Bimodal IT concept, which helped pave the way for forming the new team. The first move was to assemble the Mode Two team: OIT selected a director from the systems side of IT along with two technical staff members from the server and desktop support sides of the house, all of whom had a history of innovation within the organization. The director reported to the CIO, who was committed to the concept. With the team formed, the next step was to physically separate the two modes by moving the Mode Two team into a separate building. This ensured that the Mode Two subject matter experts were not pulled back into the daily operational tasks the Mode One team was handling.

The prior role and responsibilities of the newly appointed Mode Two director were then transferred to an existing director on the Mode One side of the department. The two directors created and agreed upon a transition plan listing all outstanding Mode One projects and tasks. A corresponding graph indicated the percentage of time the new Mode Two staff would spend on Mode One versus Mode Two projects. As the graph demonstrated, the transition took four months, and the Mode Two staff were not completely dedicated to Mode Two work until April 1, 2015. The graph allowed executive management and stakeholders to visualize the progress of the transition plan.

[Figure: Total Team Availability – percentage of Mode Two staff time spent on Mode One vs. Mode Two work during the transition]

During the transition, two professional developers were contracted and two part-time UTSA students were hired to act as assistant developers and to help in other areas. We hired one student from the Electrical Engineering graduate program and another from the Computer Science program at UTSA. While neither student had years of development and systems architecture experience, both had worked with C++, Java, and other technologies. Since the goal was to find students who demonstrated the aptitude and desire to learn, they fit the team perfectly.

Our team was finally assembled with two systems professionals, two developers, and two part-time student workers. It was time to start putting our project list together and build out the planning and documentation processes.

From the very beginning of the team’s formation, we made transparency a priority. We built the project list into a SharePoint form open to all of campus, so the UTSA community could see what the Mode Two team was working on and enter requests for new projects. Anyone could submit a request, but the team reserved the right to approve or reject anything on the list. The main criterion for accepting a request was that the project had to provide a business and/or student benefit.

Instead of consuming large amounts of time on long, elaborate project charters and project plans, we created a process built around a simple hypothesis form requiring three key pieces of information (sketched in code after the list):

  • Project summary and problem statement
  • High-level requirements
  • Business or student benefit
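
To make the form concrete, here is a minimal Python sketch of the hypothesis form as a data structure. The field names mirror the three items above; the class name, validation rule, and sample request are illustrative assumptions, not the actual SharePoint form definition.

    # A hypothesis-form request with the three required pieces of information.
    from dataclasses import dataclass

    @dataclass
    class HypothesisForm:
        summary: str             # project summary and problem statement
        requirements: list[str]  # high-level requirements
        benefit: str             # business or student benefit

        def is_complete(self) -> bool:
            """A request is reviewable only when all three fields are filled."""
            return bool(self.summary and self.requirements and self.benefit)

    request = HypothesisForm(
        summary="Classroom lecture-capture pilot",        # hypothetical example
        requirements=["record audio/video", "publish within 24 hours"],
        benefit="Students can review lectures on demand",
    )
    print(request.is_complete())  # True

Keeping the intake this small is the point: a request either states a problem, its requirements, and its benefit, or it goes back to the requester.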

A traditional Mode One trait that does not go away is thorough documentation from inception to completion. Since these projects were eventually handed back to the Mode One teams, it was imperative that build documents, lessons learned, and other information be delivered to the team tasked with implementing the project.

Key Lessons Learned
The Bimodal concept includes two separate and fully functioning groups. It is essential that management does not forget the Mode One team or allow the Mode Two group to become known as an “elite team.” Management should remember that the Mode One team requires their attention and support because they are, after all, managing the heart of the IT infrastructure.

The leadership style of the Mode Two team should be democratic and focus on collaboration. Everything should be designed to function within a collaborative environment.

It may take time for the Mode Two team to change their mindset and ways of doing things. Managers can coach them and reinforce the new way of thinking. It is also important to have the full support of senior management, especially the CIO, when managers begin working with the Mode One team.

Both teams (Mode One and Mode Two) offer two different ways of approaching projects and solving problems. When both teams are used properly and to their full potential, the entire IT department can benefit and produce impressive results.

Don’t assume your white-haired customers aren’t on social media

Imagine it’s 1955 and your next-door neighbor is watching Gunsmoke on his fancy new TV. But you’re not really into this newfangled gadget with its grainy moving pictures. So you’re listening to your radio – The Lucky Strike Program with Jack Benny is on.

The way you see it, TV is just a passing fad. Like the Hula Hoop, poodle skirts, or air travel. None of these things appeal to you, so you project your distaste onto the world at large and assume TV will fade away. “TV will be long gone soon, so why waste time and money on it?” you say to yourself.

That’s the thing with social media. Folks who aren’t into Facebook, Twitter, Instagram, etc., often assume that since they aren’t fans, these new communications channels are either just passing fads or only for the young. Boy, are they wrong to assume that.

Look, I write this as a 50-something guy who grew up loving radio, TV and – gasp! – newspapers printed on actual paper. If time and the unstoppable march of technological progress had stopped 30 or 40 years ago, I wouldn’t have minded. The Internet, Skype, smart phones, Pandora, podcasting, YouTube, Dropbox – these and so many other communications, information and entertainment technologies wouldn’t exist if the calendar still said today was May 18, 1985. People would still be reaching for a printed Yellow Pages book to find a local company. And life would go on anyway.

But the pages of the calendar keep flipping. And over the years, millions upon millions of people in my age range have embraced the new ways of communicating. Don’t make me whip out a string of statistics—you can see this with your own bespectacled eyes—but middle-aged and senior citizens are all over social media. Yes, Instagram skews younger and Facebook skews older. But it’s quite common to see a head full of white hair cocked over an iPhone, with thumbs flailing out a new social post.

Yet not a week goes by that I don’t hear a white-haired client or colleague tell me that their customers are older and, therefore, they see no need to engage their organization in social media. “Our customers/members/clients still like doing things the old way, with pen and paper,” they’ll say. Or, “They don’t really use stuff like ‘Tweeter’ so it would be a waste of time for us to set up an account. Our printed newsletter still works just fine.”

It’s hard to know where to begin in responding to those kinds of statements. But let’s take a shot.

First, no one should assume the people they need to reach aren’t social media users. A simple survey of customers/members/prospects/etc. could be a real eye opener on this one. We should also not assume that someone not using social media today will never become a social media maven. Hey, some folks just got their first cell phone within the past year, and I bet the vast majority of them will never give it up.

Second, if the company or organization in question currently revolves around older customers or members, that should be a red flag. (Or perhaps a white flag to match the dominant hair color.) In other words, the long-term viability of the organization may depend on recruiting younger members or customers. Do you think you’re more likely to attract them with an ad in the printed newspaper or with a highly targeted social media campaign? And does your absence from social media automatically lead those younger prospects to conclude that your organization is only for older folks?

I know. Letting go of the old ways is hard. But one way to evolve into the modern era is to do just that – evolve. Gradually. Take a few baby steps toward creating your organization’s social media presence and get comfortable with the basics of how it all works. Start a couple of accounts. Facebook and Twitter are easy to use, and you’ll quickly pick up on some great tools, like hashtags and sharing, for connecting with your audience.

Remember, effective marketing isn’t about what you like. It’s about what your customers like, where they are, and how they want to interact with you. Hint: More and more of them—of all ages—are on social media. You need to be there, too.

Backup and Disaster Recovery

Recently we had a technology group meeting regarding backup and disaster recovery. During the presentation, the topic generated many points of discussion about the best ways to perform backups and which solutions had worked best in the personal experience of those in attendance. Suffice it to say, the opinions were quite varied. This made me think: if technology professionals hold that many different opinions on the topic, it must surely be confusing for the everyday consumer, or for the IT professional without much experience in this segment of the industry! So I thought I’d shed some light on the types of solutions that are available and what users may expect from them.

The types of backup solutions that are typically seen in small businesses are:

  • Portable (USB) hard drives – These are typically hooked up to a server or main PC, and files are backed up to them either automatically via software or manually by a key person in the office or the IT professional. The drives are typically set up to save a certain number of backup sets before the oldest are overwritten by the newest (a rotation-pruning sketch follows this list). Multiple drives can also be rotated to extend the number of backup sets kept and to allow drives to be taken offsite in the event of a catastrophe.
  • Cloud backups – These are a category unto themselves, and there are countless solutions available; some of the most common are Carbonite, Mozy, and CrashPlan. These backups typically run daily in an automated fashion. Some offer unlimited data storage; others have data limits set by the package being paid for monthly. These services also typically have retention settings that determine how many days, weeks, or months of data sets are kept (i.e., how far back you can go to recover data).
  • BDR (Backup & Disaster Recovery) devices – These are, in most situations, the equivalent of having a backup server in your office, providing both onsite backup and cloud backup. The backups are typically snapshots of your in-house server, stored both locally and in multiple geographically separate data centers for redundant protection against catastrophes. If the building housing your server is damaged, this solution can bring up a virtual instance of your server in the cloud; if your server hardware fails, it can bring up a virtual instance on the local appliance, buying time to fix the issue while minimizing downtime.
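
As promised above, here is a minimal Python sketch of the retention behavior described for portable drives: keep the newest few backup sets and remove the oldest. The drive path and retention count are hypothetical placeholders, and the delete is left commented out for safety.

    # Keep the newest KEEP backup-set folders on the drive; prune the rest.
    from pathlib import Path

    BACKUP_ROOT = Path("E:/backups")  # assumed mount point of the portable drive
    KEEP = 5                          # number of backup sets to retain

    def prune_old_backups(root: Path, keep: int) -> None:
        """Delete the oldest backup folders, keeping the `keep` newest."""
        sets = sorted(
            (p for p in root.iterdir() if p.is_dir()),
            key=lambda p: p.stat().st_mtime,  # sort oldest first
        )
        for old in sets[:-keep]:
            print(f"would remove old backup set: {old.name}")
            # shutil.rmtree(old)  # uncomment (and `import shutil`) once verified

    prune_old_backups(BACKUP_ROOT, KEEP)

Commercial backup software handles this rotation for you, but the logic is worth understanding when you choose a retention count.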

The above solutions are listed in increasing order of both how well they generally work and how much they typically cost on an ongoing basis.

Here are some observations I have made over the years, things people often are not aware of until they need to restore data or recover from a disaster:

  • Even if you are using a cloud backup solution, you should still have some form of onsite backup. The reason: depending on how much data you back up to the cloud, even a modern high-speed Internet connection could take hours or days to download everything from the cloud provider, versus a much faster restore from a local backup. Some cloud backup companies will overnight you a hard drive (referred to as seeding) with your data on it for a fee, but this still means far more downtime than restoring locally. Reserve the cloud backup for situations where your local backup has failed or you are dealing with a disaster at your building.
  • A backup is only as good as your last restore. Commonly, backups run and send out emails or status reports saying they completed successfully. However, without doing a test restore of some of the files, there is no guarantee you could actually restore them, whether from a cloud backup or onsite (see the verification sketch after this list).
  • Dropbox is not a backup solution. Dropbox is a good way to synchronize files between multiple computers, devices, and people, but it is not a backup. Its revision history lets you bring back old versions of individual files, but restoring your entire data set or whole folders at once is not possible with the consumer version of Dropbox (which is what most people use); only the business version allows full folder restores. Even then, it is still recommended to perform your own backups of your Dropbox or other file-synchronization data.
  • What type of backup are you running? Like a lot of companies, you may still be performing only file-level backups. The issue is that if your server crashes, a file-level backup typically requires rebuilding the server from scratch before the backed-up data can be restored to it, which is very time-consuming and means a lot of downtime. A “bare metal” restore backup, by contrast, is like a snapshot of your server that can be restored from absolutely nothing much faster.
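
Here is the verification sketch promised above: restore a sample file to a scratch location, then confirm it matches the original byte for byte. The file paths are hypothetical placeholders; the checksum comparison is the idea that matters.

    # Verify a test restore by comparing SHA-256 checksums of the two copies.
    import hashlib
    from pathlib import Path

    def sha256(path: Path) -> str:
        """Hash a file in 1 MB chunks so large files don't exhaust memory."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    original = Path("D:/data/ledger.xlsx")          # assumed live file
    restored = Path("C:/restore-test/ledger.xlsx")  # assumed test-restore copy

    if sha256(original) == sha256(restored):
        print("test restore verified: checksums match")
    else:
        print("MISMATCH: the backup may be corrupt -- investigate now")

Run a spot check like this on a schedule, not just once, so your “backups completed successfully” emails actually mean something.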

The moral of the story is that there are a lot of backup solutions out there with different functionalities. It is best to know what solutions are out there, what your current backup solution is, and what kind of downtime you are looking at in the event of hardware failure or disaster.

If you would like a review of your current backup solution setup or to talk more in-depth about the available options, feel free to contact me.

End of Support: Windows Server 2003


End of support for the Windows Server 2003 family of products, including R2, arrives on July 14, 2015. As part of normal product lifecycles, and to accommodate the shift toward modern technology and mobility, Microsoft will completely end support on that date; security patches and updates will no longer be available. Now is the time to plan your migration to Windows Server 2012 R2 Datacenter, Microsoft Azure, or Microsoft cloud services like Office 365. An alert from the Department of Homeland Security underscores the seriousness of this event, and Microsoft encourages all businesses to carefully evaluate a migration plan:

“Research from IDC confirms that businesses should thoughtfully consider using this moment as a starting point for the shift toward modern technology. We think customers should take advantage of this deadline and see it as an opportunity not only to move forward to a newer version of Windows but also to modernize and prepare for the next generation of computers. Hybrid and public clouds are important components of next-generation IT.”
Source…

End of support means:

  • No updates
    37 critical updates were released in 2013 for Windows Server 2003/R2 under Extended Support. No updates will be developed or released after the end of support.
  • No compliance
    Lack of compliance with various standards and regulations can be devastating. For example, falling out of compliance with the Payment Card Industry (PCI) Data Security Standards might mean companies such as Visa and MasterCard will no longer do business with you, or the new cost of doing business may include catastrophic penalties or astronomically high transaction fees.
  • No safe haven
    Both virtualized and physical instances of Windows Server 2003 are vulnerable and may not pass a compliance audit. Microsoft Small Business Server (SBS) 2003 servers are also affected. Staying put will likely cost more in the end. Maintenance costs for aging hardware will also increase. Added costs will be incurred for intrusion detection systems, more advanced firewalls, network segmentation, and so on—simply to isolate Windows Server 2003 servers.

Many applications will also cease to be supported once the operating system they run on is unsupported. This includes all Microsoft applications.

Consequently, now is the time to act: start planning your migration now. Servers may still be running Windows Server 2003/R2 for a number of reasons, and you can use those reasons as discussion points (a minimal inventory sketch follows the list):

  • Perceived challenges of upgrading applications
  • Presence of custom and legacy applications
  • Budget and resource constraints
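
As a first concrete step, here is the minimal inventory sketch mentioned above: flag every machine in an asset list still running Windows Server 2003. The CSV file name and its column names are assumptions; substitute the export from your own asset-management tool.

    # Flag servers still on Windows Server 2003 in an inventory export.
    import csv

    INVENTORY = "server_inventory.csv"  # assumed columns: hostname, os

    with open(INVENTORY, newline="") as f:
        at_risk = [row for row in csv.DictReader(f) if "2003" in row["os"]]

    for row in at_risk:
        print(f"{row['hostname']}: {row['os']} -- migrate before 2015-07-14")
    print(f"{len(at_risk)} server(s) need a migration plan")

A list like this turns the abstract deadline into a concrete work queue you can budget and schedule against.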

Among other possible resources, you can take steps now to safeguard your business and make migration a priority.

Leveraging 802.11ac Wireless for a University Deployment


Whether submitting a quiz on a laptop, streaming the latest episode of a hit television program on a mobile phone, or using a tablet to participate in an in-class activity, college students rely on wireless internet. These days, nearly everything uses the internet for collaboration and communication, and students carry several devices, expecting them all to be constantly connected.

The explosion of personal wireless devices has created challenges for campus wireless networks around the country, and The University of Texas at San Antonio (UTSA) is not immune. UTSA, a large four-year public institution, is now experiencing similar issues as the number of wireless devices used by students has dramatically increased. Students, faculty, and staff are expanding their wireless use on campus, conducting research for projects and streaming content for entertainment. The intense demand for instant access to information, for academic and personal use alike, requires innovative, up-to-date solutions.

A Growing Mobile Culture

Wireless usage for student devices such as laptops, tablets, and cellphones has tripled over the past three years. In a growing mobile culture, improving wireless performance is a task that benefits the entire university community. Having these resources available is important for universities competing for students, financial resources, and recognition as top tier institutions. In order to deliver exceptional technology services for the community, UTSA researched a strategy for implementation of a new 802.11ac wireless solution.

Students had previously complained about poor wireless performance in high-traffic areas (such as the library), as well as difficulty accessing resources over the internet as use of the network grew. With 750 802.11n access points and peak concurrent usage of around 17,000 users, UTSA recognized that the existing wireless architecture and hardware could no longer adequately support students, faculty, and staff in areas of high user density. Advanced system features were required for the desired functionality.
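
The scale of the problem shows up in simple arithmetic. The sketch below divides the peak concurrent user count by the access-point counts given in this article (750 before the upgrade, 1,500 after); it assumes the same peak load for comparison and is back-of-the-envelope math, not measured data.

    # Rough average client density per access point, before and after.
    peak_users = 17_000
    print(f"before: {peak_users / 750:.1f} users per AP")   # ~22.7
    print(f"after:  {peak_users / 1500:.1f} users per AP")  # ~11.3

Roughly halving the average client load per radio is a large part of why dense areas like the library behave better after an upgrade.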

As with many institutions of higher education, staying within the allotted budget, to avoid increasing tuition for UTSA students or decreasing services, was essential to finding a solution the university could maintain. To resolve these issues, the Office of Information Technology (OIT) at UTSA evaluated wireless solutions from multiple vendors to identify one that would deliver superior performance without exponentially scaling the number of access points installed on campus.

UTSA’s Approach

UTSA’s approach to upgrading the wireless network was an on-premises, controller-based 802.11ac solution. While controller-based solutions have been the status quo, UTSA found them superior to cloud-hosted wireless offerings. From the initial group, the list of vendors was narrowed to two (one cloud-based and one controller-based), and a proof-of-concept deployment was done in the highest-density area on campus – the John Peace Library (JPL). During these tests, each vendor’s solution was evaluated on performance and product features. Although performance data can be subjective, it is beneficial to have, and along with other factors this information led to the selection of a vendor. A clear technology roadmap is especially important in wireless solutions, as the technology changes so quickly.

The controller-based system proved to give UTSA the most flexibility in design and performance. Aruba provides powerful system management, performance tuning, and reporting, giving UTSA the tools it needs to create a robust wireless environment. This allows the university to continue researching other ways to improve the student experience with technology resources at UTSA.

The system’s simplified design, enhanced ease of use, and advanced security features, including a firewall and intrusion detection (IDS), help protect the network from viruses, malicious actors, and unauthorized access. UTSA has seen a vast improvement in client metrics and performance management with the new system.

With the new wireless solution in full operation, OIT has seen a reduction in complaints from students. The university doubled the number of access points available to students to 1,500, and increased bandwidth utilization demonstrates the improved performance of the wireless network throughout the UTSA campus. The new system enhances the ability to identify, filter, and manage user traffic, ensuring better performance for network-delivered curriculum, and it positions the wireless network to continue adapting and serving the university on its journey to Tier One recognition.

IT Certifications


In today’s business world, information has become the valued asset on which business decisions are made, and Information Technology (IT) is crucial to business, especially where customers are concerned. IT systems such as a patient web portal for retrieving medical records or a customer-facing ordering website are vital to an organization. IT leaders and CIOs must ensure their systems are maintained, secured, and available to meet the organization’s customer and business needs. These critical systems must be supported by well-educated, well-trained staff with proven abilities, such as IT certifications.

Certifications

IT staff can demonstrate their skills and knowledge of critical systems by becoming certified in areas such as Microsoft, VMware, Cisco, EMC, and NetApp, among many others. As that list of companies shows, certifications mainly focus on qualifying on a company’s product (Microsoft = Microsoft Certified Systems Engineer) rather than on a particular job. However, given the popularity of certifications, many IT job postings now call for specific ones. IT certifications usually require an exam, while more extensive certifications add simulations in which the candidate performs administrative tasks on the relevant product.

Microsoft originally established engineering certifications, such as the Microsoft Certified Systems Engineer, focused on specific platforms like server, desktop, and Exchange (Aranda, 2006). These have since been replaced by and expanded into multiple Microsoft certifications covering infrastructure, developer, and database tracks. The figure below shows a pictorial graph of Microsoft certifications:

[Figure: Microsoft certification tracks]

Another great certification program comes from Cisco, focused on networking systems technology for Cisco’s own gear. In 1993, the company established the Cisco Academies and soon after created benchmarked standards for network technicians (Aranda, 2007). Like other vendors, Cisco felt that educational institutions were not adequately preparing students for such specialized exams. Cisco has many certification tracks, such as Routing and Switching, Content Networking, Unified Communications, Optical Networking, Network Management, Cisco Security, Cisco Unified Wireless, Network Infrastructure, and Cisco Storage Networking (Aranda, 2007).

IT security is a major concern today due to the recent high-profile data breaches at Target, Sony, and Home Depot, so there is great need for certified security personnel. In 1989, the non-profit organization (ISC)² was established to create a global information security certification process for professionals. Five years later, the first credential for security professionals, the CISSP, was established (isc2.org, n.d.). Security certifications have since grown to more than six tracks and many levels, as shown below:

[Figure: (ISC)² security certification tracks and levels]

VMware Certification Program

One of the top certifications on the market today is from VMware. According to data provided by Foote Partners, “VMware cloud certifications are all pretty hot right now” (VMware, n.d.). Individuals who have pursued a career in virtualization have seen significant pay increases over the last year: pay for the VCDX certification has increased 28.6 percent, and VCP-Cloud positions have seen pay increases of 12.5 percent (VMware, n.d.). And while the VCAP-CID certification hasn’t seen significant growth over the last year, its recipients, according to Foote Partners data, receive 8-13 percent of base salary as a “skills pay” premium from employers (Hein, 2014). VMware has many certification and solution tracks, with certification levels ranging from Associate to Expert. Below is a list of the tracks and levels (VMware, n.d.):

[Figure: VMware certification tracks and levels]

Besides pay, VMware certification has several benefits such as (VMware, n.d.):

  • Recognition of your technical knowledge and skills
  • Greater opportunities for career advancement
  • Complimentary VMware Workstation license (for new VCP5-DCV certifications)
  • Press release support (for VCDX level certifications)
  • Bio featured in the VCDX Directory (for VCDX level certifications)


Since information is such a valued asset for business decisions, IT is crucial for businesses that want to become industry leaders. Business and IT leaders must ensure their systems are maintained, secured, and available to meet ever-increasing customer and business demands. One of the best ways to meet those demands is to employ well-educated, well-trained IT staff holding appropriate certifications from companies like Microsoft, Cisco, (ISC)², and VMware.

REFERENCES

Aranda, N. (2006). The History of Microsoft Certifications – Now and Then. Retrieved from…

Aranda, N. (2007). A Brief History of Cisco Certification Training. Retrieved from…

Hein, R. (2014). 2014’s Hottest IT Certifications. Retrieved from…

isc2.org (n.d.). History of (ISC)². Retrieved from…

VMware (n.d.). Industry Leading Certification Programs To Demonstrate Your Expertise. Retrieved from…

Wyrostek, W. (n.d.). Top 10 Problems with IT Certification. Retrieved from…

The Takedown Boomerang


So you’re happily shopping on Amazon.com (not during work hours, of course), and all of a sudden you come across an e-book that looks awfully familiar. You take a closer look…hmmm. “Wait a minute,” you mutter, “that can’t be right. That’s the book that I wrote! It’s only supposed to be available on my site! And my price is much higher than Amazon’s!” You check your site, and – sure enough – traffic is down, and you’re losing sales.

You call your trusty IP lawyer, who sends a takedown letter to Amazon.com. Amazon.com removes the offending e-book, no counter-notice is issued, and you soon see an uptick in e-book sales. That’s the way it’s supposed to work, right? Pretty easy solution – no lawsuit, no long process?

I won’t address all the intricacies of how the process can play out, but yes, that’s often how it works under Section 512 of the Digital Millennium Copyright Act (DMCA 512), and it has worked fairly well for the last fifteen or so years. Google gets a staggering number of takedown requests – over 34 million in the past month alone.

However, the simplicity of the process has resulted in abuse, as well. DMCA 512 is not an all-purpose tool for shutting down whatever offends you online. And if you use it improperly, you could face a real publicity nightmare. Many sites now contain some version of a hall of shame – places where tales of DMCA overreach can live forever.

Don’t assume that the target of your takedown notice cannot or will not fight back. Organizations such as the Electronic Frontier Foundation (EFF), Digital Media Law Project, Chilling Effects Clearinghouse, and others stand ready to help targets push back on overreach. Many popular blogging platforms, such as WordPress.com, have also taken steps to protect their users against abuse.

When Diebold tried to use the DMCA to force students to take down Diebold’s leaked internal emails about voting machine flaws, the EFF stepped in to help – and won. In a turnabout, Diebold had to pay damages for filing frivolous takedown notices. That’s right – the DMCA allows targets to turn the tables if you send a false notice.

Online criticism is a fact of online life, and if you’re the target, the DMCA may not be your best option for dealing with it; you may end up amplifying the criticism. When an organization called Straight Pride U.K. used the DMCA to force takedown of an apparently unfavorable email interview posted by student journalist Oliver Hotham, Automattic (which runs WordPress.com) filed a lawsuit in the journalist’s defense and won. That takedown notice set off a wave of articles and a lawsuit – which had the unintended effect of broadcasting the organization’s views.

Bottom line: think through the possible ramifications before firing off a DMCA notice. The DMCA is a powerful tool when used properly, but if you misuse it, your efforts might boomerang on you with unintended consequences.

CIO Panel: Harnessing Digital – Mobile/Cloud/Data

On Thursday, April 9th, I attended the 2015 San Antonio North Chamber CIO Panel, presented in conjunction with InnoTech. The sellout event at the San Antonio Convention Center drew over 300 representatives from San Antonio’s leading business and government organizations. After opening remarks by Gary Britton of New Horizons Learning Centers, the San Antonio North Chamber Technology Chair, the session turned to the annual and lifetime technology leadership awards. Bill Phillips, Senior VP/CIO of the University Health System, received the IT Executive of the Year award; his accomplishments included leadership in major facilities and technology upgrades. David Monroe received the first Lifetime Achievement Award for his many accomplishments, starting with his leading role at Datapoint, San Antonio’s first breakout technology company.

In keeping with the theme Harnessing Digital – Mobile/Cloud/Data, Todd Chudd, Practice Director for Mobile & Modern Web at Randstad, presented his thoughts on how to succeed in the coming digital environment. As context, he pointed out that the typical American now spends 10% of their time on a mobile device. In his view, the successful future enterprise must continuously measure, adapt, and change again. Retailers, he argued, must move from the customer loyalty program as a source of data for macro marketing and sales strategy to one that creates the basis for one-on-one customer relationships.

CIO Panel Discussion

After that presentation, the session shifted to its main business, the CIO Panel Discussion. The panel included four of San Antonio’s most prominent technology leaders: Chris Cox, USAA, Head of Digital Delivery; Apollo Gonzalez, Catapult Systems, Chief Technology Officer; Greg Sarich, CPS Energy, Senior Vice President Enterprise Support & CIO; and Doug Skiba, Frost Bank, Executive Vice President IT Architecture & Strategy.

The first topic tackled by the panel was mobile strategy and measurement. All of the panelists advocated a customer-driven strategy. Highlights on measurement included Frost Bank’s reliance on Google Play and Apple App Store customer ratings, CPS Energy’s goal of shortening walk-up lines (with particular attention to the walk-up customer who spends the entire wait on a digital device), and USAA’s focus on adoption and utilization.

Next up was a discussion of the impact of 24/7 mobile access on support. Again the group generally agreed that support must match customer access, including adding support modes for chat, voice, and video. Catapult sees the explosive jump in user access driving a move to the cloud to assure availability and reduce the risk of excessive downtime. CPS sees access to multiple mobile applications driving ever-increasing demand for service and new capabilities.

The third topic was the impact of the “Internet of Things.” Of note was USAA’s move from an episodic relationship with customers to a continuous one based on input from customers’ “things.” CPS is particularly affected as meters evolve from billing triggers read monthly into energy management devices, read every 15 minutes, for both the customer and the company. Catapult sees the major impact in the mountains of data generated by everything from vehicles to thermostats to refrigeration units.

The final topic was the impact and application of the “cloud.” As a group, the panelists were relatively conservative on the use of public or third party cloud applications and storage because of security concerns. Generally, common administrative and support systems like email were considered the best candidates for the cloud. The one exception to that thinking was CPS Energy’s use of a California-based provider that collects and processes all of their meter data.

My sense as the crowd exited the room was general agreement that it had been time well spent and a value-added event. I’ll certainly save the date for 2016!