
Business-IT Strategic Alignment in Complex Multinational Corporations (MNCs) – Research Results

Effective management of IT requires planning processes that create a high degree of business-IT alignment. Business-IT strategic alignment is among the main management concerns of information systems managers and corporate chief executives. The problem is that alignment becomes difficult to implement as companies strive to link business and technology in light of the internationalization of their businesses. The purpose of IT strategic alignment research is twofold: (a) to identify the reasons why alignment gaps exist between business goals and IT strategies, and (b) to find a fit between business objectives and IT plans by building an integrated framework that explicates their interactive values.

Purpose of the Study

The purpose of the present positivistic research study was to investigate business-IT strategic alignment in a multinational corporation by examining (a) the role of knowledge management processes in the relationship between contextual factors and alignment, and (b) the role of IT projects in the relationship between alignment and organizational performance and effectiveness.

This study used a field survey and structural equation modeling (SEM) to analyze data collected through stratified random sampling of 263 IT and business managers employed in the U.N. Secretariat.
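For readers curious about what the SEM step looks like in practice, the following minimal sketch (not the study's actual model) shows how a simple knowledge-management-to-alignment-to-performance structure might be estimated in Python with the semopy package; the latent constructs, indicator names, and data file are hypothetical illustrations only.

# A minimal, illustrative SEM sketch; it is not the study's actual model.
# Assumes the semopy package and a CSV of survey responses with
# hypothetical column names (km1..km3, align1..align3, perf1..perf3).
import pandas as pd
import semopy

# Hypothetical measurement and structural model in lavaan-style syntax:
# knowledge management (KM) -> alignment (ALIGN) -> performance (PERF).
model_desc = """
KM    =~ km1 + km2 + km3
ALIGN =~ align1 + align2 + align3
PERF  =~ perf1 + perf2 + perf3
ALIGN ~ KM
PERF  ~ ALIGN
"""

data = pd.read_csv("survey_responses.csv")   # hypothetical file of 263 responses

model = semopy.Model(model_desc)
model.fit(data)                              # estimates loadings and path coefficients
print(model.inspect())                       # parameter estimates, std. errors, p-values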

Implications for Leaders in MNCs

This study has several managerial implications for the consideration of business executives and information systems managers and provides insights to researchers on the issues of business-IT strategic alignment in complex multinational organizations.

The results of this study have at least four implications for leaders in MNCs:

(a) the effects of top managers’ knowledge of IT on strategic business-IT alignment
(b) the importance of business-IT alignment to organizational performance and effectiveness
(c) the importance of internal context and nature of the organization to knowledge integration
(d) the role of senior management in knowledge management and strategic management of IT.

Recommendations for Leadership

A theoretical and practical perspective of business-IT strategic alignment in the U.N. Secretariat is provided. The business-IT strategic alignment implementation model for MNCs (mSAIM) is the application model proposed as the key recommendation of this research study.

The mSAIM model is a five-stage process, or roadmap, for business-IT strategic alignment. It draws upon the proposed business-IT strategic alignment model for MNCs (mSAM), which covers the process and content perspectives of the interrelationship between business and IT for this category of organizations.

The Five-Stage Process of IT Strategic Alignment

The five-stage process of IT strategic alignment covers process and content perspectives of the interrelationship between business and IT in MNCs. The process perspective includes the following dimensions: (a) intellectual and social, (b) short- and long-term, (c) shared domain knowledge, and (d) enablers and inhibitors. The content perspective focuses on the strategic orientation of business enterprises and the strategic orientation of the existing portfolio of information systems.

The five-stage process summarizes the steps for a successful IT strategic alignment in MNCs. The process has five stages, each of which is associated with one of the nine reasons explaining IT project failure. The steps are: clarifying the strategic business orientation, developing the leadership competencies of business and IT managers, sharing knowledge, strategically planning IT projects, and strategically managing IT and technological change.

Information:

Don’t forget to join the Business-IT Strategic Alignment community and share your experience using LinkedIn.

Please send me an invitation to connect; you can then download the five-stage process of business-IT strategic alignment from the SlideShare presentations on my profile.

Coming Soon:

Two peer-reviewed articles and two books on Business-IT strategic alignment for complex MNCs will be published soon.


Practical Applications of Total Quality Management (TQM) – Part I

Total Quality Management (TQM) is a philosophy of management that strives to make the best use of all available resources and opportunities through continuous improvement. TQM means achieving quality in terms of all functions of the enterprise. Many researchers have attempted to analyze how IT and TQM can jointly add value to organizations, and the purpose of this first post on TQM is to evaluate the practicality of TQM in an IT service.

In this evaluation, service management needs are balanced against the reality of bottom-line effectiveness. The post also provides a list of critical success factors to consider in a change management initiative undertaken by an IT service.

TQM in Practice

The essence of quality is to do it right the first time and to satisfy customer requirements every time by involving everyone in the organization. The works of Crosby and his colleagues on the evolution of TQM cut across all pervasive philosophies of management. TQM has been a key business improvement strategy since the 1970s, as it has been deemed essential for improving efficiency and competitiveness. TQM aims to achieve an overall effectiveness that is higher than the sum of the individual outputs from sub-systems such as design, planning, production, distribution, customer focus strategy, quality tools, and employee involvement. This philosophy of management strives to make the best use of all available resources and opportunities through continuous improvement.

As a management philosophy, TQM makes use of a particular set of principles, practices, and techniques to expand business and profits, and it provides a path to enhanced productivity by avoiding rework, rejects, waste, customer complaints, and high costs. This is achieved by emphasizing the organization's commitment to data-driven, problem-solving approaches to quality improvement.

The five basic pillars of TQM are: (a) top management commitment to quality enhancement, (b) customer-centric advancement of processes and the building of a long-lasting, trustworthy relationship between the organization and the customer, (c) relentless development by setting goals and deadlines, (d) benchmarking with several specific tools and quality-adding techniques, and (e) strengthening the employee base by concentrating on quality at every stage of a process where customer satisfaction is at stake. Table 1 provides a summary of the key dimensions that constitute TQM.

Table 1: TQM key dimensions

TQM dimensions   Description
Top management leadership   Top management commitment is one of the major determinants of successful TQM implementation. Top management has to be the first in applying and stimulating the TQM approach, and they have to accept the maximum responsibility for the product and service offering. Top management also has to provide the necessary leadership to motivate all employees.
Customer relationships   The needs of customers and consumers and their satisfaction should always be in the mind of all employees. It is necessary to identify these needs and their level of satisfaction.
Supplier relationships   Quality is a more important factor than price in selecting suppliers. Long-term relationship with suppliers has to be established and the company has to collaborate with suppliers to help improve the quality of products/services.
Workforce management   Workforce management has to be guided by the principles of: training, empowerment of workers and teamwork. Adequate plans of personnel recruitment and training have to be implemented and workers need the necessary skills to participate in the improvement process.
Product design process   All departments have to participate in the design process and work together to achieve a design that satisfies the requirements of the customer while respecting the technical, technological, and cost constraints of the company.
Process flow management   Housekeeping is practiced along the lines of the 5S concept. Statistical and non-statistical improvement instruments are applied as appropriate. Processes are mistake-proofed, self-inspection is undertaken using clear work instructions, and the process is maintained under statistical control (a minimal control-limit sketch follows Table 1).
Quality data and reporting   Quality information has to be readily available and the information should be part of the visible management system. Records about quality indicators have to be kept, including scrap, rework, and cost of quality.
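As a minimal illustration of the statistical control idea in the process flow management row of Table 1, the sketch below computes approximate X-bar control limits from hypothetical subgroup measurements; it ignores the classic bias corrections (such as the c4 factor) for brevity, so treat it as a teaching sketch rather than a production SPC tool.

# Minimal sketch of X-bar control limits for the "statistical control" idea in
# Table 1. Measurements and subgroup size are hypothetical illustrations.
import numpy as np

# Each row is one subgroup of repeated measurements (e.g., ticket resolution times).
subgroups = np.array([
    [4.1, 3.9, 4.3, 4.0],
    [4.2, 4.4, 3.8, 4.1],
    [3.7, 4.0, 4.2, 4.3],
    [4.5, 4.1, 3.9, 4.0],
])

n = subgroups.shape[1]                       # subgroup size
xbar = subgroups.mean(axis=1)                # subgroup means
grand_mean = xbar.mean()

within_var = subgroups.var(axis=1, ddof=1)   # within-subgroup variances
sigma_hat = np.sqrt(within_var.mean())       # pooled spread estimate (c4 correction ignored)

ucl = grand_mean + 3 * sigma_hat / np.sqrt(n)   # upper control limit for subgroup means
lcl = grand_mean - 3 * sigma_hat / np.sqrt(n)   # lower control limit for subgroup means

out_of_control = (xbar > ucl) | (xbar < lcl)
print(f"center={grand_mean:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}")
print("subgroups out of control:", np.where(out_of_control)[0].tolist())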

TQM and Change Management Initiative for IT Performance

A business firm achieves world-class status when it has successfully developed operational capabilities through TQM to support the entire company in gaining a sustained overall performance advantage over its competitors. Although there is insufficient statistical evidence to conclude that there are significant simple relationships between TQM and IT service quality performance, many studies have investigated the notion that TQM practices provide approaches to improve the economic position in the service sectors in general. Both IT and TQM have had, and will continue to have, a significant impact on most organizations. I only regret the lack of empirical research on the relationship between the two and how they both relate to business performance.

Critical success factors for TQM implementation in an IT service are summarized in Table 2. The implementation of TQM in an IT service should facilitate the adoption of appropriate policies and procedures that enhance the eight categories of TQM.

Table 2: Critical success factors for TQM implementation

Categories   Critical success factors
Leadership Top management leadership and commitment, supervisory leadership, organizational commitment.
Quality unit Role of quality unit, strategic quality management.
Empowerment Training, employees’ satisfaction, employees’ relations, teamwork structures for improvement, providing assurances for employees, education.
Supplier management Supplier quality management, supplier integration, external interface management, supplier partnerships.
Process management Product design, quality policy, quality improvement measurement systems, operating procedures, operational quality planning.
Quality of data Quality data reporting, quality information systems, technology utilization.
Customer satisfaction Customer satisfaction orientation, people and customer management.
Communication Communication of improvement information, cross functional communications to improve quality.

Business-IT Strategic Alignment Models for Complex MNCs

My dear readers,

I am pleased to inform you that I have just completed a research study on Business-IT Strategic Alignment Models for Complex Multinational Corporations (MNCs). The purpose of this positivistic research study was to investigate business-IT alignment in an MNC by examining (a) the role of knowledge management processes in the relationship between contextual factors and alignment, and (b) the role of IT projects in the relationship between alignment and organizational performance and effectiveness.

The objective of this 4-year study was to provide a theoretical and practical perspective of business-IT strategic alignment in the U.N. Secretariat. The sample consisted of 166 IT managers and 97 business managers from 50 offices in the U.N. Secretariat. The study focused on two aspects of strategic IT planning within the U.N. Secretariat: (a) business-IT strategic alignment and (b) IT project planning. This study drew upon the strategic alignment model (SAM) and the typology of MNCs to propose and test an IT strategic alignment model for MNCs (or mSAM) using the U.N. Secretariat as a field study.

A theoretical and practical perspective of business-IT strategic alignment in the U.N. Secretariat is provided. The business-IT strategic alignment implementation model for MNCs (mSAIM) is the application model proposed as the key recommendation of this research study.

Please visit my website www.nkoyock.net regularly. On the front page of the site, you can read the background of the study, the statement of the problem, the purpose of the study, the data analysis, the main outcomes of the research study, the research findings, the implications for leaders in the U.N. Secretariat and in MNCs, and the recommendations for leadership. More materials, such as articles and books, will be available very soon.


Strategic Intrapreneurship for Machine Organizations?

Machine organizations are structures fine-tuned to run as integrated, regulated, and highly bureaucratic bodies. Machine organizations have the same basic characteristics: (a) their operating work is routine, (b) the greatest part of it is rather simple and repetitive, and (c) their work processes are highly standardized. In a machine organization, there is little room for intrapreneurship.

Max Weber (1864-1920) elaborated the theory of bureaucracy to establish a rational basis for the organization and management of large-scale organizations. For Weber, bureaucracy meant management by the office (German Büro) or position rather than by a person or patrimonial rule. Weber analyzed bureaucracy as the most logical and rational structure for large organizations. For him, bureaucracy was not the rule-encumbered inefficiency the term connotes in modern parlance; it characterized machine organizations.

Many international organizations are prime examples of machine organizations. Most of them were created after World War II. Their agendas and focuses have expanded tremendously in recent years. This vast expansion in mandates and responsibilities necessitates a change in organizational and management practices. These organizations cannot afford to remain static anymore; they need to evolve into agile bodies with rapid deployment capabilities and multidisciplinary experts capable of handling a wide range of global issues.

Many scholars and practitioners have argued that strategic intrapreneurship is the best strategy to rationalize a machine organization. Intrapreneurship can be defined by its content, which includes dimensions based on the Schumpeterian innovation concept, a building block of entrepreneurship. The pursuit of creative solutions to challenges confronting the organization, including the development or enhancement of old and new products and services, markets, administrative techniques, and technologies for performing organizational functions, as well as changes in strategy, organizing, and dealing with competitors, may be seen as innovation in the broadest sense. An increase in intrapreneurship could be a key component of the success of this form of organization because such organizations operate in rapidly changing industries.

Intrapreneurs are intra-organizational revolutionaries, or entrepreneurs within established organizations. Intrapreneurs try to challenge the status quo and fight to change the system from within. They are generally driven by their internal locus of control, reinventing companies, transforming them, and pushing them to new heights, sometimes with and most of the time without top management support.

The management of intrapreneurship within an organization is complex for many reasons. The first reason is the nature of the organization (machine organization, bureaucracy, large or small organization, etc.). Entrepreneurship is a context-dependent social process through which individuals and teams create wealth by bringing together unique packages of resources to exploit marketplace opportunities.

The second reason is the creation of a viable intrapreneurial attitude. To create a viable intrapreneurial attitude, the firm must be sensitive to the nature of its rewards system. While it is true that many intrapreneurs are challenge-oriented, that is, the true intrapreneur is inspired by success achieved in the face of obstacles, there comes a time when even the most selfless intrapreneur begins to ask, "What is in this for me?" Entrepreneurship has been described at both the individual level (Mintzberg's standpoint) and the organizational level (Miller and other scholars). Mintzberg noted that a viable rewards system is only possible if the concept of entrepreneurship is associated with the individual level.

The third reason is the paradox of corporations. Intrapreneurs create innovative corporate business models and reinvent organizations. The paradox is that intrapreneurs are not always welcomed in organizations.

Various scholars have proposed 10 gateways to intrapreneurship that can make a real difference in an organization's ability to compete. These are: (a) a culture of workforce empowerment, risk-taking, and action; (b) celebrating and rewarding ideas, progress, and results; (c) free-flowing customer information and internal communication; (d) management support and engagement at all levels; (e) ongoing encouragement and promotion of risk taking and new ideas; (f) developing processes for idea generation and advancement; (g) clearly defined organizational needs, vision, and direction; (h) developing better cooperation and teamwork; (i) providing resources to support new ideas; and (j) cross-training and special arrangements. These gateways can be valuable to corporate executives who need to foster an entrepreneurial style in their organizations.

The application of these gateways requires innovative leadership competencies. Organizations whose success depends on innovation require a leadership style totally different from the one typically used by most leaders. Whereas leaders of traditional organizations succeed through their ability to artfully manipulate their environment, innovation leadership emanates from a manager's creative initiative, intellectual preeminence, and technical or unique expertise that is of value to each individual in the group and translates into direct benefits for all.

Your thoughts?


Redefining IT Performance and IT Effectiveness

Performance refers to the ability to acquire resources necessary for organizational survival.  Organizational performance results from a combination of industry or environmental conditions, the strategy that an organization’s decision makers choose, and the structure in place to support the strategy.  Performance is a proxy measure that indicates legitimacy by resource suppliers and perceived organizational effectiveness. Performance measurement consists of an assessment tool to measure effectiveness, provides information to managers for decision-making, and helps them to analyze organizational efficiency at the operative and strategic levels. 

 

Performance issues relate to the disciplines of general systems theory elaborated by Ludwig von Bertalanffy in the 1920s.  This theory included various disciplines such as behavioral science (sociology), economics (management accounting), information technology, mathematics (operations research), and organization theory.  These core disciplines for agency and management theories form a suitable umbrella for HR management, public administration, and management control systems. The literature on management philosophies provides an examination of these disciplines with an emphasis on corporate culture and power, Taylor’s scientific management, Mayo’s humanistic management, or quality management.

 

From a performance standpoint, three major components relate to management and agency theories: analysis, evaluation, and measures.  Several methods facilitate a performance analysis: expert systems, data mining, factor analysis, geographic information systems, ratio analysis, statistical regression, structural equation modeling, and productivity theory (data envelopment analysis, total factor productivity, and stochastic frontier analysis).  Whereas performance measures result from various frameworks such as the balanced scorecard and performance pyramid, performance evaluation includes strategic management issues that cover the alignment between incentive means of knowledge workers and corporate strategic goals and processes.
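As a hedged illustration of just one method from this list, statistical regression, the sketch below fits an ordinary least squares model of a performance measure on two hypothetical drivers using numpy; the variables and figures are invented for illustration and stand in for whatever indicators an analyst would actually collect.

# Minimal sketch of one method from the list above (statistical regression):
# ordinary least squares of a performance measure on two hypothetical drivers.
import numpy as np

# Hypothetical observations per business unit: IT spend, training hours, performance index.
it_spend    = np.array([1.2, 2.3, 0.8, 1.9, 2.7, 1.5])
training    = np.array([30.0, 45.0, 25.0, 40.0, 55.0, 35.0])
performance = np.array([67.0, 78.0, 60.0, 74.0, 85.0, 70.0])

X = np.column_stack([np.ones_like(it_spend), it_spend, training])  # add intercept column
coef, residuals, rank, _ = np.linalg.lstsq(X, performance, rcond=None)

print("intercept, IT-spend effect, training effect:", np.round(coef, 3))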

 

Early traditional frameworks of organizational performance, such as Du Pont's pyramid of financial measures (1920s), took a single, all-encompassing approach. Du Pont's framework, for example, focuses only on financial performance. In contrast, emerging tools integrate the complexity and dynamic aspects of organizations by considering various dimensions of performance. In this line of reasoning, performance measurement covers the processes, system dynamics, and strategies that characterize a business.

 

Even though the overall performance of the information systems function seems to be difficult to conceptualize and measure, two approaches can be distinguished in research into the business value of IT investments: variance and process approaches. The variance approach focuses on the relationship between IT investments and organizational performance by considering financial measures such as lower costs, higher revenues, and improved market share.  The variance approach examines the “what” question: What is the relationship between IT investments and organizational performance? In contrast, the process approach focuses on the “how” question: How do IT investments improve organizational performance?

 

The process approach combines the returns on investments with process and organizational changes. The process approach analyzes the impact of IT on organizations from efficiency, effectiveness, and strategic IT alignment standpoints. IT efficiency concerns the IS function and highlights the relationship between IT expenditures (or IS capabilities) and IT assets (or IS function outputs such as systems performance, information effectiveness, and services performance). IS capabilities are inputs such as hardware, software, human skills, and management processes that serve to translate IT expenditures into IT assets. Researchers use various metrics to assess IT efficiency: availability of systems and applications, number of help desk tickets, mean time between failures, and license usage. These metrics speak to the efficiency of systems, applications, and networks, unlike other performance variables that focus on engineering performance.
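As a small illustration of two of the metrics just named, availability and mean time between failures (MTBF), the following sketch computes them from a hypothetical incident log; the reporting period and outage durations are invented for the example.

# Minimal sketch of two IT efficiency metrics named above: availability and MTBF.
HOURS_IN_PERIOD = 30 * 24          # one 30-day reporting period

# Hypothetical outage durations (hours) for one system during the period.
outages_hours = [1.5, 0.5, 3.0]

downtime = sum(outages_hours)
uptime = HOURS_IN_PERIOD - downtime
availability = uptime / HOURS_IN_PERIOD
mtbf = uptime / len(outages_hours)          # mean operating time between failures

print(f"availability = {availability:.4%}")  # roughly 99.31% in this example
print(f"MTBF = {mtbf:.1f} hours")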

 

Prior scholars pointed out the limitations of IT efficiency measurements for assessing IT effectiveness. The influence of IT on organizations is moving gradually from being an efficiency production factor toward the maximization of the business value of IT investments (or IT effectiveness). Enterprises generally invest in IT for two reasons: (a) to capture information to support corporate processes, and (b) to enable business change. These scholars advised that the contribution of IT is both specific (by supporting defined business processes) and generic (by enabling undefined business change). They added that measurement models of the IT business value should differ from performance models and be closer to capability models.

 

Within the context of strategic IT planning, some of the prior research attempted to investigate the linkages between IT investment projects and the associated business value using selected financial measures related to performance and productivity.  Some other studies attempted to measure the business impact of IT in organizations by market expansion, cost avoidance, customer value, efficiency, and profitability.  Some other research compared two analytical models (linear and nonlinear) and two conceptual (contingency-based and resource-centered) frameworks to assess the business value of IT using both financial objectives (expense and revenue) and perceived measures (firm’s perceived profitability).

 

Drawing upon the theoretical input-output model, Chang and King (2005) developed an instrument that explored the role of the IS function on business process effectiveness and organizational performance. Silvius (2006) proposed a multivariate value framework to assess the impact of IT on an organization. Yeniyurt (2003) proposed a performance measurement framework for global corporations drawing upon methods involving both financial and non-financial variables such as Skandia navigator, economic value added, and balanced scorecard. Yeniyurt’s non-financial variables for the organizational performance and effectiveness construct are customer satisfaction, innovation, internal processes, and organizational culture and climate.

 

The research on the IT project planning process can be subdivided into strategic and operational perspectives. Some research on IT project planning explored the strategic aspects and the identification of projects that match corporate objectives. Other studies focused on the analysis and selection of a project from several capital expenditure alternatives (or capital budgeting of IT investments). The traditional capital budgeting methods are based on the calculation of cash inflows and outflows. Seven traditional budgeting models are used to evaluate capital projects: (a) payback method, (b) return on investment, (c) cost-benefit ratio, (d) profitability index, (e) net present value, (f) economic value added, and (g) internal rate of return.
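To make the arithmetic behind three of these models concrete, the following minimal sketch computes the payback period, net present value (NPV), and internal rate of return (IRR) for a hypothetical IT project; the cash flows and discount rate are illustrative only, not figures from any study.

# Minimal sketch of three traditional methods listed above (payback, NPV, IRR)
# for a hypothetical IT project.
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the initial (negative) outlay at t=0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """Years until the cumulative cash flow turns non-negative (None if never)."""
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return t
    return None

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return by bisection (assumes NPV changes sign on [lo, hi])."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cash_flows) * npv(mid, cash_flows) <= 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# Hypothetical project: 100 upfront, then five years of net benefits.
flows = [-100.0, 30.0, 35.0, 40.0, 25.0, 20.0]
print("NPV at 10%:", round(npv(0.10, flows), 2))
print("Payback (years):", payback_period(flows))
print("IRR:", f"{irr(flows):.2%}")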

 

These traditional capital budgeting methods are of limited use for valuing IT projects because of (a) their inability to cope with risk, uncertainty, and flexibility; (b) their neglect of the cost to train users, the learning curve to adapt to new technologies, and the social subsystems costs and benefits of IT projects; and (c) their inability to quantify intangible benefits such as improved knowledge, customer service, or decision making. These shortcomings are especially clear with IT investments made under conditions of uncertainty in today's global economy, which requires dynamic capabilities and strategic flexibility. The real options approach has been proposed as an alternative to the deterministic capital budgeting methodologies and as an extension of financial option theory to options on real (non-financial) assets.
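As a hedged sketch of the real options idea, the code below values the option to defer a hypothetical IT investment as an American call on the project's present value, using a Cox-Ross-Rubinstein binomial lattice; all figures (project value, cost, volatility, rate) are illustrative assumptions, not parameters from any cited study.

# Minimal sketch of a real option: the option to defer a hypothetical IT
# investment, valued as an American call on the project's present value
# with a Cox-Ross-Rubinstein binomial lattice.
import math

def deferral_option_value(pv, cost, sigma, r, years, steps):
    """Value of the right (not obligation) to invest `cost` any time within `years`."""
    dt = years / steps
    u = math.exp(sigma * math.sqrt(dt))      # up factor per step
    d = 1 / u                                # down factor
    p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral probability
    disc = math.exp(-r * dt)

    # Option payoffs at maturity for each terminal node (j up-moves out of `steps`).
    values = [max(pv * u**j * d**(steps - j) - cost, 0.0) for j in range(steps + 1)]

    # Roll back through the lattice, allowing early exercise (invest now) at each node.
    for step in range(steps - 1, -1, -1):
        values = [
            max(disc * (p * values[j + 1] + (1 - p) * values[j]),   # wait
                pv * u**j * d**(step - j) - cost)                   # invest now
            for j in range(step + 1)
        ]
    return values[0]

# Project worth 100 today, costs 110 to build, 2-year deferral window.
print(round(deferral_option_value(pv=100, cost=110, sigma=0.35, r=0.05,
                                  years=2, steps=100), 2))

Note that investing immediately in this example has a negative NPV (100 minus 110), yet the deferral option itself is worth a positive amount; that flexibility value is exactly what the deterministic capital budgeting methods overlook.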

 

Some References:

 

Chang, J. C., & King, W. R. (2005). Measuring the performance of information systems: A functional scorecard. Journal of Management Information Systems, 22(1), 85-115.

Silvius, A. J. G. (2006). Does ROI matter? Insights into the true business value of IT. Electronic Journal of Information Systems Evaluation, 9(2), 93-104.

Yeniyurt, S. (2003). A literature review and integrative performance measurement framework for multinational companies. Marketing Intelligence & Planning, 21(3), 134-142.


GIS, Virtualization, and Environmental Uncertainty – Part I

Changes in international business over the past few decades have brought greater internationalization and integration. The term globalization captures these changes, whose considerable impact is seen in increased cross-border movements of goods, services, capital, technology, and people. Based on the dimensions of global integration and local responsiveness, four forms of organization are used to manage international business: global, international, multidomestic, and transnational corporations.

 

This post focuses on Global Corporations (GCs). GCs prefer to market a standardized product worldwide for economic reasons while concentrating production, marketing, and research and development activities in a few favorable locations. The issues around the expansion of business to a global level relate to the external environment of the organization as well as its internal environment.

 

The management of external environmental uncertainty is critical for the success of global corporations. The major sources of uncertainty in the external environment are the number of different forces that firms have to manage, the degree to which the external environment is changing, the resources available in the environment, and business continuity management of Global Information Systems (GIS).

 

GIS drive the information society and enable knowledge workers to connect and communicate in ways that drastically change their work. Four main factors generally influence the decisions each organization makes in designing and pursuing its GIS: (a) interoperability, (b) total cost of ownership, (c) security, and (d) transparency and the public right to information.

 

Recent earthquakes in Haiti and Chile, the violent European windstorm Xynthia, or hurricane Katrina in New Orleans (U.S.) are reminders to business and IT managers that preparedness to protect critical information systems and data against natural and man-made disasters, swift response, and quick recovery are necessary tools to assure business continuity.

 

Business continuity planning is about having plans and procedures in place to recover key business processes after a disaster. Participants in a recent business continuity management survey perceived failure of computer hardware or software and data loss as the highest risks of business disruption, with 21% of the executives stating that natural disasters such as storms, floods, and earthquakes were of particular concern. Disasters are not restricted to fire, flood, and other causes of property damage; they can equally result from more mundane problems such as labor strikes or hardware and software malfunctions.

 

Virtualization, business impact analysis, redundancy, and offsite data centers are various approaches to ensure business continuity. Virtualization (or virtual machine technology) refers to a framework or methodology of dividing the resources of a computer into multiple execution environments, by applying one or more concepts or technologies such as hardware and software partitioning, time-sharing, partial or complete machine simulation, emulation, quality of service, and many others.

 

As companies exploit the growing possibilities of international business, technology leaders must build consensus for an organizational structure that enables the expansion of information systems. The purpose of the posts of this thread is to evaluate the expansion from the perspective of a Chief Information Officer and discuss the issues within the expansion scenario as they relate to environmental uncertainty, business continuity, and virtualization considerations.

 

Global Corporations and Global Information Systems

 

Information Systems (IS) organization refers to the combination of technologies, processes, people, and promotion mechanisms to improve the performance and effectiveness of the organization. IS affects nearly all aspects of human endeavor and also assists in the management and operations of various types of organizations.

 

Since the 1960s, managing and operating IS to improve organizational performance and effectiveness has been a field of practice. First known as business data processing and later as management information systems, the field is currently referred to as information technology (IT). Ongoing innovations in IS and growing worldwide competition add difficulties and uncertainties to corporate environments. Global information systems attract attention from both practitioners and scholars because they are a critical enabler of competitive advantage for international businesses.

 

The operational priorities of GCs require innovative capabilities and create new requirements for the IS function of GCs. Prior research categorized the requirements of GCs into four areas: (a) decreasing the cost structure, (b) increasing innovation, (c) leveraging information assets, and (d) becoming more agile.

 

Information systems are fundamental to effective global operations because they enable coalitions and provide a coordination mechanism for geographically dispersed activities. Information systems are a disruptive phenomenon for global corporations because of their capacity to change the competitive landscape and enable new organizational structures, products, processes, and ways of communicating.

 

The nature and function of GIS should concur with the operational shifts of GCs highlighted above. The strategic use of global information systems (GIS) depends on the ability of corporate managers to appreciate the business value of IT and use it as a competitive tool. GIS organizations must provide resources to lead and support IT-enabled business transformation initiatives by simplifying global operations, automating the streamlined processes, and relocating some business processes to lower-cost locations. The increased focus on innovation in the business, for example, requires GIS organizations to increase the productivity and effectiveness of their research and development capabilities. The focus on agility and innovation creates new demands on GIS organizations to rapidly deliver the information management frameworks essential for intelligent and informed business decision making.

 

Virtualization

 

The idea of virtualization is to partition a physical computer into several logical zones. Each of these partitions can run a different operating system and function as if it were a completely separate machine, using one or more of the concepts and technologies noted earlier, such as hardware and software partitioning, time-sharing, partial or complete machine simulation, emulation, and quality of service.

 

The idea behind virtualization is an extension of what is found in a modern operating system (OS). A program running on a UNIX machine, for example, has its own virtual address space. From the program's perspective, it has a large chunk of RAM to use (4GB on a 32-bit machine). The operating system is responsible for multiplexing real memory among programs. This large and contiguous space does not exist in the real machine: some of it will be scattered around real memory while the rest might be stored on a hard disk.

 

Memory is not the only resource virtualized by a modern OS. The CPU is usually allocated to different processes using some form of pre-emption. When a process has used its fair share of the CPU, it is interrupted and another is allowed to take its place. From the process's perspective, it has a CPU of its own (or more than one, as is the case with dual-core or quad-core machines).

 

Virtualization is not a new technology. Beginning in the 1960s, IBM developed a handful of virtual machine systems, including CP-40, CP-67, and VM/370. In all of these instances, a virtual machine monitor (VMM) ran between the application and hardware layers. Through this VMM, multiple virtual operating systems could be created, utilized, and shut down without interfering with other virtual machines using the same VMM. This research placed IBM at the forefront of the virtualization race and is acknowledged, along with research assistance from MIT, as the foundation of modern virtualization.

 

Virtual machines have been implemented in various forms over the years: mainframe, open-source, paravirtualization, and custom approaches. Complexity in chip technology and approaches to solving the x86 limitations of virtualization have led to three different variants of virtual machines: (a) software virtual machines, (b) hardware virtual machines, and (c) virtual OS/containers.

 

Software virtual machines manage interactions between the host operating system and a guest operating system (e.g., Microsoft Virtual Server 2005). In the case of hardware virtual machines, the virtualization technology sits directly on the host hardware (bare metal), using hypervisors, modified code, or APIs to facilitate faster transactions with hardware devices (e.g., VMware ESX). EMC's VMware technology is the market leader in x86 virtualization. The VMware solution is more costly, but it provides a robust management console and full-virtualization support for an array of guest operating systems including Solaris, Linux, Windows, and DOS. In the case of virtual OS/containers, the host operating system is partitioned into containers or zones (e.g., Solaris Zones, BSD jails).

 

There are several vendors of virtualization technology, and each product comes with its own features that make it adaptable to various scenarios. Some virtualization technologies are (a) Microsoft Virtual Server or Hyper-V and (b) EMC's VMware suite (VMware Workstation, VMware Server, VMware ESX, and vSphere). Whereas the VMware suite is adaptable to most operating systems, including Novell and UNIX, Microsoft Virtual Server is proprietary.

 

The huge number of centralized services and the processing power concentrated in data centers at GC headquarters are the main reasons adequate virtualization matters. Virtualization reduces the number of servers, maintenance and server management costs, and power consumption and cooling costs.

 

Business continuity and disaster recovery planning is the other main reason why GCs are virtualizing their services. Business continuity planning is the elaboration of plans and procedures in place to recover key business processes following a disaster. The plans and procedures for a business continuity planning process encompass (a) business impact analysis, (b) backup and restoration strategy, (c) redundancy, (d) offsite data centers, and (e) virtualization.

 

Virtualization, GIS, and Management of Environmental Uncertainty

 

Environmental uncertainty is a central issue for the deployment of global information systems. Uncertainty refers to events the organization cannot forecast. The major sources of uncertainty in the environment are usually the (a) complexity and the number of different forces an organization has to manage, (b) dynamism or the degree to which the environment is changing, and (c) richness or the amount of resources available in the environment. The accurate perception of uncertainty emanating from the environment is critical to organizational performance, organizational structure, firm strategy, and business continuity and disaster recovery planning.

 

Natural disasters can produce both horrifying and stunning tales of human tragedy and triumph. But after the initial dust has settled, an aftershock materializes as businesses struggle to resume their operations. The Gartner Group noted that 43% of such companies were immediately put out of business by a major loss of computer records, and another 51% permanently closed their doors within two years, leaving a mere 6% survival rate.

 

From the business continuity and disaster recovery perspectives, the strategic use of GIS depends on proactive business continuity planning by IT executives. Business Continuity Management (BCM) programs ensure that organizations adopt best practices through industry certification standards such as the British standard BS 25999-2:2007. This standard specifies requirements for establishing, implementing, operating, monitoring, reviewing, exercising, maintaining, and improving a documented BCM system within the context of managing an organization's overall business risks.

 

Virtualization, server consolidation, storage, remote access, security, and green initiatives are among the various challenges companies face with expansion of IS at a global level. Organizations are primarily deploying virtualization to improve server and system utilization rates, increase server reliability and uptime, and enhance business continuity. I believe that successful virtualization of GIS depends on the approaches adopted and the ability of measuring the performance of the virtualized environment.

 

The purpose of the next post (Part II) will be to explore approaches to the virtualization of GIS and identify performance measurement indicators for virtualized global environments.

 

Your thoughts?


Foucault and the Critique of Modernity


This post is not a biographical work on Michel Foucault, but a quick sketch of his life and the environment in which he was educated; these help to better understand the philosophical aspects of his works (his academic formation was in psychology and history): (a) research and analysis of philosophy's traditional critical project in a new historical manner, and (b) critical engagement with the thought of traditional philosophers.

 

Michel Foucault (1926-1984) is one of the French figures usually associated with radical postmodern philosophies. Despite his bourgeois origins, he sympathized early with vulnerable groups such as artists, homosexuals, and prisoners. Like many young thinkers of his generation, Foucault was largely influenced by (a) the French tradition of history and philosophy of science represented by Georges Canguilhem, (b) French avant-garde literature, with the writings of Georges Bataille and Maurice Blanchot, and (c) the philosophical milieu and its methods of writing history based on archaeology and genealogy.

 

The purpose of this post is to explore and reflect upon Foucault’s critique of modernity. First, an analysis of his historiographical approaches (archaeology and genealogy) is provided. Second, Foucault’s postmodern perspectives on the nature of modern power and his argument that the modern subject is a construct of domination are explained. Third, the political implications of Foucault’s genealogical method are analyzed as well as his work on technologies of the self. Finally, by taking the examples of institutions and technologies, this post provides some indications of the conservative aspects of Foucault’s work.

Postmodernism and the Critique of Modernity

 

Modernism was the cultural revolution of the 20th century, as important to it as Romanticism was to the 19th century and the Enlightenment to the 18th. The word modern has its origins in the early medieval modernus, meaning that which is present, of our time, and by extension new, novel, or contemporary. From about 1900 to the 1960s, modernism reigned as a succession of varied movements and styles that reacted against historicism and recognized individual perception and experience as the cornerstone of the creative process.

 

Postmodernism arrived in the mid-1960s and reached its apex in the early 1980s. Postmodernism is an intellectual current that rejects the Enlightenment project of modernity. This involves, among other things, a radical critique (and often uncritical rejection) of objectivity, of the a priori subject as a source of meaning, authenticity, and authority, of the importance of truth and abstract reason, of the teleological approach to history, of universalizing grand narratives that aspire to completeness, and of the distinction between high and low culture. For postmodernists, science is nothing more than a narration, a myth, or a social construction.

Analysis of Foucault’s Historiographical Approaches

 

Archaeology and genealogy are the two approaches Foucault applied in his critique of historical reason. To understand these historiographical models, one should trace the evolution of philosophy from its beginnings with Socrates (and his project of questioning knowledge) to Kant (for whom philosophy is the critique of knowledge) through Descartes (rationalism), Locke (empiricism), and Hume (induction). For instance, Hume held that our expectation that the future will resemble the past is built up from recurrent experiences rather than from any genuine knowledge. For Kant, reality is the sum of what can be experienced. He added that the mind has a set of rules for how experience must be constructed, and he concluded that these rules must always apply to reality.

 

Foucault rejected this prescriptive definition of knowledge, which establishes a set of conditions that, if met, would equate knowledge with truth, making it certain and definitive. He instead sought a set of rules that can account for how people, at a specific time and place and in particular domains, produce knowledge and separate it from error, opinion, and belief. Foucault did not merely accept the scandal of existing knowledge (people at different times and places have known differently); he made this scandal the focal point of his analysis, seeking to identify (through archaeology) the historical conditions of possibility of knowledge.

 

The history of knowledge, he argued, can be written only on the basis of what was contemporaneous with it, and certainly not in terms of reciprocal influence but in terms of conditions and a prioris established in time. Drawing on Nietzsche's genealogy, he described his conception of history as genealogy, delegitimizing the apparent objectivity of the present and revealing the foreignness of the past. Foucault rejected any form of global theorizing, avoided totalizing forms of analysis, and was critical of systematicity. He showed that ideas usually taken to be permanent truths about human nature and society change in the course of history.

Knowledge, Power, and Foucault’s Perspectives

Foucault's theory of power is opposed to classical approaches based on a juridico-political conception of power (Hobbes, Machiavelli) or on class opposition and domination (Marx). Foucault's works explored the shifting patterns of power within a society and the ways in which power relates to the self. This led to different appearances of power such as disciplinary power, bio-power, governmental power, and repressive power. Discipline and Punish followed Madness and Civilization and The Birth of the Clinic as the next stage in Foucault's massive project of tracing the genealogy of control institutions (asylums, teaching hospitals, and prisons) and of the human sciences symbiotically linked with them (psychiatry, clinical medicine, criminology, penology).

 

The main concern of Foucault throughout his publications was the relationship between knowledge and power and the articulation of each on the other. Nietzsche thought that a will to power motivates human behavior and that traditional values had lost their power over society. For Foucault, following Nietzsche, knowledge ceases to be a liberation and becomes a mode of surveillance, regulation, and discipline. Foucault opposed the humanist position that once we gain power, we cease to know (because it makes us blind) and that only those who are in no way implicated in tyranny can attain the truth. For Foucault, such forms of knowledge as psychiatry and criminology are directly related to the exercise of power. He added that power itself creates new objects of knowledge and accumulates new bodies of information.

 

Technologies of the Self and Political Implications of Foucault’s Genealogical Method

 

The third major shift of Foucault’s work is the focus on technologies of the self, ethics, and freedom in the 1980s. Technologies of the self are practices by which subjects constitute themselves within and through systems of power. These systems often seem to be either natural or imposed from above. Foucault theorized that the body is a subject of technologies of power. These technologies are established through discourses of “expertise” such as medicine, law, and science. Through these discourses or truth games, individuals develop knowledge about themselves, while bodies become the site of domination through technologies of power, practices of discipline, and surveillance.

 

Foucault's work on modern power and government inspired other works (for example, studies of the neoliberalism of the New Right) that explore politics and political institutions. Like Foucault's genealogies, most of these works embody hostility to the humanist notions of the subject and truth. This hostility sets up various themes which can be seen as constitutive of a Foucauldian approach to the study of political institutions. These themes can be found in Foucault's work on power and government. They can be divided into those arising from a critique of traditional structuralism, a critique of the subject, and a rejection of objectivism.

 

Foucault's genealogies provided examples for a political science that would take seriously the anti-foundationalist view that we have neither pure experiences nor pure reason. Such a view overlaps considerably with Foucault's concern to decentralize structures, analyze the ways in which individuals are constructed by their social context, and renounce appeals to a natural or immanent reason.

 

Critical Comments on Foucauldian Perspectives

 

Having outlined some of Foucault's arguments against technologies and institutions, the first criticism of his work is that he refused to see any advantage in modernity in domains like medicine. Unlike Habermas, who thought that science is unproblematic when it operates according to the rules of right, Foucault dwelled on repressive forms of rationalization and never delineated any progressive aspects of modernity. For him, all aspects of modernity are disciplinary, which is quite difficult to accept. Foucault's analysis did not focus so much on the question of right but rather on the mechanisms through which power effects are produced. Instead of fixing the legitimacy of science or asking what the proper domain of certain knowledge is, Foucault examined the role of certain knowledge in the production of effects of power.

 

The second criticism of Foucault's work is that he disregarded the fact that domination has its basis in the relations of production, in exploitation, and in the organization of the state. In line with Poulantzas' criticisms, one can note that Foucault neglected to study the modern form of the state and its derivation from the capitalist mode of production. He did not see that social phenomena always occur in relation to the state and class division. He exaggerated the importance of disciplinary techniques in the modern state and thus neglected the continued importance of violence, legal coercion, and law in general. Unlike Poulantzas, who saw some virtues in law and the state (reproducing consent, constituting social relations, and winning mass support), Foucault emphasized only the repressive, prohibitive side of law and the positive, productive side of (state) power.

Conclusion

Foucault's work can be summarized in three major shifts: from the archaeological focus on systems of knowledge in the 1960s, to the genealogical focus on modalities of power in the 1970s, to the focus on technologies of the self, ethics, and freedom in the 1980s. Foucault contributed to many fields in the humanities and social sciences. As a member of the postmodernist movement, and in line with its deconstruction paradigm, he tried to show the problematic and suspicious aspects of rationality, knowledge, subjectivity, and the production of social norms. He thought that the quest for power invaded social and personal life and pervaded schools, hospitals, prisons, and the social sciences. Foucault saw a link between power, truth, and knowledge, and he argued that liberal-humanist values became entangled with, and supportive of, technologies of domination. He criticized both macro theorists who see power only in terms of class or the state, and micro theorists who analyze institutions and face-to-face interaction while ignoring power altogether.


Moral Leadership: President Obama as a Role Model?

 

The U.S. Senate passed a historic $871 billion health care reform bill on December 24, 2009, handing President Obama a Christmas Eve victory on his top domestic priority. Should it become law, the measure would constitute the biggest expansion of federal healthcare guarantees since the enactment of Medicare and Medicaid more than four decades ago. It is expected to extend insurance coverage to 30 million additional Americans.

 

Fourteen days earlier, on December 10, 2009, the Norwegian Nobel Committee awarded the peace prize to President Obama “for his extraordinary efforts to strengthen international diplomacy and cooperation between peoples,” noting that it attached special importance to Obama’s moral leadership.

Leadership always calls forth images of vision, courage, commitment, and forthright action. Leaders are usually considered individuals who possess a clear vision of what needs to be done and are capable of transforming their visions into substantial achievements. Leaders can demonstrate accuracy in their actions, enabling them to defeat all obstacles and opponents to achieve their goals.

Leadership has been interpreted in various ways. Beyond those conceptions, the notion can basically be considered an art of transforming others in the manner desired by the leader, depending on the environment. Leadership refers to a process of social influence in which an individual influences others to transcend personal interests in the accomplishment of a common objective. This idea of transformation conveys the premise that leaders can profoundly alter both followers and themselves for the good if they exhibit effective behaviors, deploy the correct techniques and actions, demonstrate high ethics, and imagine long-term goals in a suitable environment.

 

Some scholars showed that there is a difference in kind between the exercise of power and the exercise of leadership, and that the difference is a moral one. The ultimate test of moral leadership is its capacity to transcend the claims of the multiplicity of everyday wants, needs, and expectations by responding to the higher levels of moral development. The transforming power of moral leadership requires a certain kind of character and a certain kind of wisdom in relation to that character. When leaders apply moral standards, public interests may supplant personal interests, and the correctness of a decision depends on the soundness of the reasoning that justifies the leader's actions.

 

Scholars and policymakers interested in the human impact on society should read Lives of Moral Leadership by Robert Coles. This book underscores the role and determination of the individual in altering negatives in society and culture. Coles attempted to identify leadership attributes and show when and how such personalities can make an impact on environmental, institutional, and elite structures. He successfully demonstrated that leaders can impact the world through their courage, abilities, knowledge, example, and moral grounding.

 

A close look at the short public life of Barack Obama, through his thoughts, decisions, and actions, is warranted to grasp the message that true leaders work for justice. The Obama administration, for example, allowed unlimited travel to Cuba by Cuban Americans and lifted limits on transfers of money to relatives on the Caribbean island. In addition to easing travel and remittances, the new rules expanded the list of gifts Cuban Americans can send to their families in Cuba and allowed U.S. telecommunications companies to do business there. Such decisions are simply morally correct!

 

Moral principles are tightly associated with the meaning of life. As human beings, we live either by no genuine moral rules or by absolute ethical principles. Ethical principles are either relative or absolute. We are always challenged to think about the various ways in which we make sense of ourselves, the society in which we live, the world around us, and our relationship to it. We usually analyze current and classic treatments of meaning and sense-making in philosophical, psychological, and cultural terms. After all, every human being constructs a fundamental philosophy of the basis of life, a theory of the relation between the individual and society. This philosophy shapes the individual's whole attitude toward life.

 

Perhaps at no other time in recent history has the question of moral leadership been so acutely relevant. The global financial meltdown, the diplomatic struggle of the United Nations to reach agreement at Copenhagen on climate change, the decision of Iran to boost its nuclear program despite sharply increased concerns of Western governments and the United Nations, and the massive anti-globalization protests around the world all dramatized issues of moral leadership at individual, institutional, national, and international levels.

 

I salute Obama's positive thinking: maintaining a non-judgmental approach to both other people and situations, maintaining self-assurance, being optimistic, and communicating effectively. The problem for him is that, because of all his capabilities, failure is not an option. President Obama cannot let us down!

Posted on 172 Comments

Web 2.0: Toward an Open-Content Movement?


A friend of mine, a brilliant economist, recently asked me whether there was a new version of the Web called Web 2.0 and what its possible link could be with the new Internet Protocol, IPv6. My response was the following: Web 2.0 is a new version of the Web that replaces Web 1.0, and there are no (for now!) direct implications for IPv6. Although that response is correct, I did not really share with my friend the philosophy behind the emergence of Web 2.0 and all the technical implications of this move.


Web 2.0 first emerged in 2004 during a meeting between O’Reilly Media (a technology firm) and MediaLive International (a conference planning firm) while they were organizing a conference about the Internet. Initially, the term Web 2.0 was nothing more than an attractive name designed to emphasize the evolution of the Internet. The difference, for example, between IPv6 and Web 2.0 is that the former is a formal revision of the Internet Protocol (designed to replace IPv4), whereas the latter is a combination of a number of disparate ideas, practices, and programs.


The main difference between Web 1.0 and Web 2.0 is that Web 1.0 delivers information to people, whereas Web 2.0 allows the active creation of information by users. The development of Web 2.0 applications involves both developers and users. MySpace, Digg, YouTube, Twitter, Facebook, and Wikipedia are all examples of sites that facilitate collaboration between content generators and content users.


Web 2.0 technologies are open-source tools that foster collaboration and participation. Web 2.0 tools facilitate the publication and storage of textual content on blogs and wikis as well as podcasts (audio recordings) and vidcasts (video materials). Web 2.0 principles from a collaborative perspective include individual creativity, harnessing the power of the crowd, diverse data on an epic scale, architecture of assembly, independent access to data, and so on.


Web 2.0 technologies can be synthesized into two categories: (a) applications and programmatic considerations including Asynchronous JavaScript and XML (AJAX), blogs, Really Simple Syndication (RSS), Cascading Style Sheets (CSS), social software, and wikis; (b) function and conceptual design covering usability, participation, remixability, focus on simplicity, joy of use, convergence, and mobility.
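
To make the asynchronous request pattern behind AJAX concrete (updating a page without a full reload), here is a minimal browser-side sketch in TypeScript. The /api/comments endpoint, the Comment shape, and the "comments" element id are hypothetical illustrations, not part of any specific site mentioned above.

```typescript
// Minimal AJAX-style sketch: fetch user-generated content asynchronously
// and update the page without a full reload. Endpoint and types are
// hypothetical.
interface Comment {
  author: string;
  text: string;
}

async function loadComments(): Promise<void> {
  const response = await fetch("/api/comments"); // hypothetical endpoint
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  const comments: Comment[] = await response.json();

  // Render the user-generated content into the existing page.
  const list = document.getElementById("comments");
  if (list) {
    list.innerHTML = comments
      .map((c) => `<li>${c.author}: ${c.text}</li>`)
      .join("");
  }
}

loadComments().catch(console.error);
```

The point of the sketch is the interaction style: the browser requests only the data it needs and weaves it into the page, which is what makes the participatory, user-driven applications listed above feel responsive.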


Web 2.0: A New Philosophy


More than the development of models and tools, Web 2.0 is a new philosophy behind the evolution of the Internet. This philosophy is similar to the open-source movement initiated by Stallman during the 1980s. Even though the open-source movement began in the 1960s with the advent of the ARPAnet project of the U.S. Defense Department, the roots of the contemporary open-source odyssey are connected directly to Stallman’s GNU project, a free, libre, open-source alternative to proprietary versions of the Unix operating system (OS). Stallman’s GNU project spanned the course of a decade and resulted in a number of initiatives eventually licensed for distribution under the GNU General Public License (GNU GPL). The Linux OS (developed by Linus Torvalds, with its first release in 1991), Netscape’s Navigator (open-sourced in 1998), MySQL, Firefox, PHP, Ruby, Perl, Python, and Joomla (which I am using to develop this site!) are a few FLOSS and open-source initiatives.


Free, libre, and open source software (FLOSS) is software that is liberally licensed to grant users the right to study, change, and improve its design through the availability of its source code. Stallman developed the free software definition and the concept of copyleft to ensure software freedom for all.


Software drives the information society. Software enables us to connect and communicate in ways that drastically change how we work and play. Software facilitates productivity and, at the same time, delivers the digital lifestyle. The open-source movement has gained both momentum and acceptance as its potential benefits have been increasingly recognized by individuals, corporate players, and governments.


A 2003 survey of open-source developers conducted by scholars at Stanford University revealed that the spatial distribution of the open-source movement has evolved into a global phenomenon covering all continents, including Africa. Many governments (Peru, Venezuela, Vietnam, Malaysia, India, Germany, the Dominican Republic, Ecuador, France, etc.) have adopted open-source software in most of their agencies. Today, a growing number of companies are moving to FLOSS.


After the open-source movement, we are witnessing a new paradigm shift with Web 2.0. This new gestalt (from Kuhn’s perspective) is what I call “The Open-Content Movement”. The purpose of the movement is to facilitate collaboration and participation toward a global collective intelligence or, to wink at the Fishers, a global distributed mind.


The main idea of the distributed mind is that multiskilling for each individual tends to be less important for knowledge teams than putting together the right team of people who collectively have multiple skills. The contribution of computer-mediated communication systems (CMCS) to the Fishers’ distributed mind is to facilitate collaboration among virtual knowledge teams.


A CMCS is the use of the computer to structure, store, process, and distribute human communications. A CMCS is frequently used for asynchronous text-based communication, meaning that the participants are distributed in time and space. It can also include graphics or digitized voice, as well as real-time (synchronous) exchanges such as chats and instant messaging (for example, Lotus Notes Sametime, Skype, Yahoo Messenger, and so on). The most common forms of CMCS are electronic mail, computerized conferencing, and bulletin board systems.
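
To make the asynchronous mode concrete, here is a toy, in-memory TypeScript sketch of a bulletin-board-style exchange in which participants post and read at different times. The class and field names are hypothetical and stand in for a real store-and-forward system.

```typescript
// Toy sketch of asynchronous, text-based CMCS exchange: messages are stored
// and read later, so participants need not be online at the same time.
interface Message {
  author: string;
  text: string;
  postedAt: Date;
}

class BulletinBoard {
  private messages: Message[] = [];

  post(author: string, text: string): void {
    this.messages.push({ author, text, postedAt: new Date() });
  }

  // A participant catches up on everything posted since their last visit.
  readSince(lastVisit: Date): Message[] {
    return this.messages.filter((m) => m.postedAt > lastVisit);
  }
}

const board = new BulletinBoard();
board.post("Alice", "Draft agenda uploaded to the wiki.");

// Bob, in another time zone, reads the thread hours later.
const bobLastVisit = new Date(Date.now() - 24 * 60 * 60 * 1000);
console.log(board.readSince(bobLastVisit));
```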


The global distributed mind of the new open-content movement will illustrate the capacity of the people of the world to participate and collaborate. It will demonstrate the capacity of people to share a collective world culture that represents their common cultural mental programming. After all, this is what globalization is all about: an integration of the world’s culture, economy, and infrastructure through transnational investment, the rapid proliferation of communication and information technologies, and the impact of free-market forces on local, regional, and national economies.

Posted on 341 Comments

Distributed Knowledge

Current innovations in information and communication technologies (ICT) provide new opportunities for engaging in geographically distributed work (Hinds & Mortensen, 2005). A workforce is distributed if: 1) knowledge workers operate in different physical locations, 2) team members communicate asynchronously for most normal interchanges even with collocated colleagues, or 3) team members work with different firms or within different entities of the same parent organization (Ware & Grantham, 2003). The term distributed teams refers to teams that rarely use face-to-face communications because they are geographically dispersed and linked by ICT (Townsend et al., 1998). In this socio-technical perspective (Hazy, 2006), distributed teams are synonymous with virtual teams.

According to Curseu, Schalk, and Wessel (2007), the concept of distributed teams can be defined through three dimensions: 1) the degree and type of interdependence among teams, 2) the nature of teams (temporary or permanent), and 3) the extent to which teams rely on computer-mediated communication systems. These dimensions and the characteristics of the various types of distributed networks affect information processing, information flows, and knowledge transfer (Curseu et al., 2007; Arling & Subramani, 2006). The purpose of this essay is twofold: 1) to identify five criteria that should be considered when choosing tools to connect distributed teams, and 2) to describe how the knowledge created by these teams should be captured and used.

Connecting Distributed Teams: Models and Tools

As mentioned earlier, the construct of distributed teams refers to teams that rarely use face-to-face communications because they are geographically dispersed and linked by ICT. Based on the three dimensions of distributed teams identified by Curseu and his colleagues, two main configurations of distributed teams exist: 1) collocated teams with a few remote team members and 2) multiple geographically distributed subgroups. Members of distributed teams experience a mix of communication modes, such as face-to-face interaction and electronic communication (Arling & Subramani, 2006).

Distributed computing environments link teams and resources dispersed across networks. Distributed systems consist of several interoperable, multi-platform processing components, including operating systems and hardware architectures (Gunwani, 1999). These differences are masked by middleware technologies that provide additional services to solve issues inherent to programming in a distributed environment, such as transactions, naming, security, and reliability (Gorappa, 2007). The foundation for distributed computing systems is N-tier client-server systems (Hoganson & Guimaraes, 2003). In fact, distributed database systems are client-server systems that provide clients with concurrent access to data stored on various servers through a distributed database and a distributed database management system (Cavazos & Jarquin, 2004). Some characteristics of distributed systems are openness, resource sharing, scalability, concurrency, transparency, and fault tolerance.
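
Because the paragraph above leans on middleware masking location and failures, here is a minimal, self-contained TypeScript sketch of that transparency and fault-tolerance idea. The Replica and DistributedStoreClient classes are hypothetical, in-memory stand-ins for networked servers and a middleware layer.

```typescript
// Minimal sketch of middleware-style transparency: a client reads through a
// facade that hides which replica serves the data and fails over when one
// replica is unavailable. Everything here is hypothetical and in-memory.
class Replica {
  constructor(private store: Map<string, string>, public up: boolean = true) {}

  read(key: string): string | undefined {
    if (!this.up) throw new Error("replica unavailable");
    return this.store.get(key);
  }
}

// The "middleware" layer: masks replica location and failures (transparency,
// fault tolerance) so the client tier never deals with individual servers.
class DistributedStoreClient {
  constructor(private replicas: Replica[]) {}

  read(key: string): string | undefined {
    for (const replica of this.replicas) {
      try {
        return replica.read(key);   // first healthy replica answers
      } catch {
        continue;                   // transparent failover to the next one
      }
    }
    throw new Error("no replica available");
  }
}

const data = new Map([["projectPlan", "v2 approved"]]);
const client = new DistributedStoreClient([
  new Replica(data, /* up */ false), // simulate a failed server
  new Replica(data),
]);

console.log(client.read("projectPlan")); // "v2 approved" despite the failure
```

A real middleware layer (CORBA services or a distributed DBMS, for instance) would layer naming, transactions, and security on top of this kind of facade rather than leaving them to the application.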

In recent years, the demand for Internet-based distributed systems and applications has expanded rapidly. These technologies and applications include cluster computing, grid computing, web services, mobile systems programming, distributed algorithms, sensor networks, and the DCE, CORBA, J2EE, and .NET industry standards. The expectation is that they enable the creation of new types of enterprises and services by virtualizing resources that are geographically distributed (Buyya & Ramamohanarao, 2007). Kurdi, Li, and Al-Raweshidy (2008) used six criteria to categorize distributed systems: size, solution, interactivity, accessibility, manageability, and user-centricity. Depending on the types of services delivered, one system might be computational, global, interactive, mobile, voluntary, personalized, and automatic, whereas another might be data-oriented, project-based, batch-processing, restricted, centralized, and nonpersonalized.

Knowledge-Based View of the Firm and Knowledge Management Systems

The creation of a global society with possibilities of knowledge sharing is among the contributions of the IT revolution and globalization. In the knowledge society, the value-creating strategies and long-term viability of a firm depend on sustaining its competitive advantage. The knowledge-based view of the firm draws upon the resource-based view (Levitas & Ndofor, 2006; Williamson, 1957; Chandler, 1962; Stigler, 1961) and considers knowledge as a distinctively unique resource that should be managed. Organizational knowledge can be characterized as explicit, tacit (Regan & O’Connor, 2002), and embedded (Bourdeau & Couillard, 1999). Knowledge management (KM) refers to the ability to create and manage a culture that encourages and facilitates the creation, appropriate use, and sharing of knowledge to improve organizational performance and effectiveness (Walczak, 2005).

Organizational KM includes the identification, acquisition, storage, and dissemination of tacit, explicit, and embedded knowledge. Conceptualizations of KM, as well as of intellectual and human capital in organizational design, are usually guided by various perspectives such as information-processing theory (Tushman & Nadler, 1978; Galbraith, 1973), organizational learning theory (Senge, 1990), knowledge creation (Kearns & Sabherwal, 2007), dynamic capabilities (Collis, 1991), and the resource-based theory of the firm (Rugman & Verbeke, 2002; Wernerfelt, 1984; Penrose, 1959).

Good KM, as Charles (2005) noted, involves three elements: people, processes, and technology. Organizational technologies that support KM initiatives and knowledge workers (KWs) are called knowledge management systems (KMS). KMS are IT-based tools developed to support corporate processes of knowledge management (Feng, Chen, & Liou, 2005). KMS are classified in terms of the knowledge dimensions (tacit and explicit) and the extent of codifiability required (Becerra-Fernandez, 2000), the codification versus personalization strategy (Hansen et al., 1999), and the KM processes that are supported (Alavi & Leidner, 2001; Tiwana & Ramesh, 2000). Benbya and Belbaly (2005) have provided a classification of KMS based on the tacit and explicit dimensions. Examples of such applications are knowledge bases, business intelligence services, corporate information portals, and customer relationship management services. Five indicators are used to measure their success (Benbya & Belbaly, 2005): 1) system quality, 2) knowledge quality, 3) use and user satisfaction, 4) perceived benefits, and 5) net impact.
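
To make the five indicators concrete, here is a hedged TypeScript sketch that rolls them into a single assessment score. The 1-5 rating scale and the unweighted mean are illustrative assumptions of mine, not part of Benbya and Belbaly's framework.

```typescript
// Hedged sketch: aggregating the five KMS success indicators from
// Benbya and Belbaly (2005). The rating scale and equal weighting are
// illustrative assumptions.
interface KmsAssessment {
  systemQuality: number;      // 1-5
  knowledgeQuality: number;   // 1-5
  useAndSatisfaction: number; // 1-5
  perceivedBenefits: number;  // 1-5
  netImpact: number;          // 1-5
}

function kmsSuccessScore(a: KmsAssessment): number {
  const ratings = [
    a.systemQuality,
    a.knowledgeQuality,
    a.useAndSatisfaction,
    a.perceivedBenefits,
    a.netImpact,
  ];
  // Simple unweighted mean; a real assessment would justify its weights.
  return ratings.reduce((sum, r) => sum + r, 0) / ratings.length;
}

// Example: a corporate information portal rated by its users.
console.log(kmsSuccessScore({
  systemQuality: 4,
  knowledgeQuality: 3,
  useAndSatisfaction: 4,
  perceivedBenefits: 3,
  netImpact: 4,
})); // 3.6
```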

Conclusion

Continued globalization, coupled with the technology revolution, has changed the way many corporations operate. Organizational design and change are not easy amid increased global competitive pressures combined with the increasing use of advanced IT. The future of work and business success depend directly on an organization’s ability to redefine its business strategies, workplace, workforce, and technology. Geographically distributed teams are increasingly central to the new workforce and workplace strategies that give firms the agility and flexibility required to meet dynamically changing needs in the volatile contemporary business environment. This essay discussed criteria that should be considered when choosing technology systems to connect distributed teams. Because knowledge is considered the main source of competitive advantage in today’s organizations, this essay also described IT-based technologies that are available to support corporate processes of knowledge management.

The achievement of corporate agility and flexibility goals requires a deep rethinking of the missing element among the four identified above, that is, business strategies. To improve corporate internal features, appropriate management and leadership approaches should be implemented. On the other hand, geographically distributed teams generate new types of issues. These global issues could be handled by using contingency theory, which emphasizes that design decisions depend on environmental conditions. Its general orienting hypothesis is that organizations whose internal features are aligned with the demands of their environments will increase organizational performance and minimize uncertainty.