ABSTRACT
The Knowledge-Intensive Sustainable Evolution Dynamics (KISBED) platform (patent pending), which the authors apply in their use cases, demonstrates that the approach works. Cyber, infrastructure, and product are integrated into the Cyberinfra Product “function.” The perception properties are not tagged for long and have no carriers, and the signal travels only a short distance before it collides. The authors demonstrate KISBED through a number of examples.
INTRODUCTION
One branch of the philosophy of science, methodology, is closely related to the theory of knowledge. It investigates the methods by which science arrives at its posited truths concerning the world and critically explores the alleged foundations of these methods. In industry, the view is that certain crucial concepts arising from scientific knowledge are useful when judging the credibility of an outcome. The truth is that industry is hectic in its day-to-day arena. Performance targets aim to get a product out of the process within 90 days. We see this pressure today with cell phones, Internet games, home theaters, and the like. Research and development teams and their strategies must therefore focus on a queue of potential versions of products and services, entangled in fragile demand and based on market segments. In some circumstances the product is a mini robot, a camera, or a cell phone; it may equally be large-scale mechatronic or process equipment, such as fighter aircraft, paper machines, and turbines.
Scientific methodologists state over and over again that science is characterized by convergence: the claim that scientific theories, along their historical path, converge to an ultimate, final, and ideal theory. Sometimes this final theory is said to be true because it corresponds to the “real world,” as in pragmatic accounts of convergence. The exact nature of light, for instance, is a profound question that seems not yet fully answered. Luckily, one does not need to know exactly what light is in order to understand how it behaves and to utilize it (Pillai, 2011). This is a bridging effect between the “superfast” and the “cyberinfra product concept” that we present here. To shed further light on the topic: there are two convenient ways to describe the propagation of light and its interactions with materials. Neither system is sufficient alone, nor are they completely adequate together; at least the two systems are not contradictory. We live in the age of the ubiquitous Internet, of intelligent devices and mobile displays that are good for more than just ruining your eyes, and of companies that can no longer afford to sidestep the expanding mobile market (Pillai, 2012).
Engineers have to design their products ingeniously to avoid any “technical dead-locks” and to assure the security of their use. Such products must perform well. At any level, a wave train of radiation can be completely described by two vectors that are perpendicular to the direction of travel of the ray and perpendicular to each other. In this context it is perhaps in order to say that over the decades we have learned to produce fighter aircraft, to use them strategically, and to counter attacking probes produced widely around the globe. In reality they are mechatronic products, and to some extent they present themselves as the romantic yield of robotic engineering. Here we make the bridge. Let us now focus on the cyberinfra product concept in this chapter. We begin by putting the ingredients into essential, or potential, theory mixes; we identify and build the product-platform type; and we plan to map out prototyping strategies and test-bed opportunities.
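As a hedged aside, the two-vector description above matches the textbook picture of a plane electromagnetic wave: the electric field and the magnetic field are perpendicular to each other and to the propagation direction, and either one determines the other:

\[
\mathbf{E} \perp \mathbf{B}, \qquad \mathbf{E} \perp \mathbf{k}, \qquad \mathbf{B} \perp \mathbf{k}, \qquad \mathbf{B} = \frac{1}{c}\,\hat{\mathbf{k}} \times \mathbf{E}.
\]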
“‘If I have seen farther, it is by standing on the shoulders of giants,’ wrote Isaac Newton in a letter to Robert Hooke in 1676” (Hawking, 2002). While Newton was referring to his discoveries in optics rather than his more important work on gravity and the laws of motion, it is an appropriate comment on how science, and unquestionably the whole of civilization, is a series of incremental advances, each building on what went before. This is a fascinating theme, and the statement discloses the interplay between the gravity of science and civilization. Tracing the evolution of today’s portrait of business from the gravity of science leads to the avant-garde state of Internet liberation. This and several other covering-law models1 support the view of scientific explanation as a deductive argument that contains, non-vacuously, at least one universal law among its premises. Thereby, to scrutinize the laws of business is to look for scalable, adaptive, cost-effective, collective, and pinpointed solutions (CE-NET, 2010). Clearly, there is a need to develop a systematic and holistic approach, and the Virtual Enterprise is the answer to it. It is triggered by ubiquitous (anywhere, anytime) technology that is within reach (that is, easy and cost-effective) and produced in collaborative environments. The system is able to maintain the overall security, and interoperability is likewise tagged to the system. The ultimate goal is to cement the vision of the business world by turning it “inside-out,” into a plug-and-play Internet business community (Pallot, Salminen, Pillai, & Kulvant, 2004). In practical terms, the so-called “Extended or Virtual Enterprise” approaches may create a life-size dilemma in the future: each time a new partner enters the system, the management burden grows, including the integration costs. This is mostly due to clogged value visions and misinterpretations, which drag down ideal collaboration among trading partners.
Recently there has been a great interest in the Semantic Web and issues related to specification and exploration of semantics on the World Wide
Web. Berners-Lee, the initiator of the World Wide Web (WWW), lays stress on the importance of the “Semantic Web” for machine-understandable Web content and emphasizes the need for ontology (Berners-Lee, Hendler, & Lassila, 2001). Though the self-describing eXtensible Markup Language (XML) is in charge of the syntax of Web content, clear definitions of the semantics of the domain knowledge are required to implement machine-understandable content. Ontology and ontological analysis are fundamentally needed to represent knowledge about the domain and to be able to share the information (Chandrasekharan, Josephson, & Benjamins, 1999). In particular, shared ontologies are being proposed for representing the core knowledge that forms the foundation for semantic information on the Web. We identify two broad thrusts related to ontologies:
- Approaches to standardize the formal semantics of information to enable machine processing; work is to be done as part of the W3C working group (a minimal sketch of such machine-processable semantics follows this list).
- Approaches to define real-world semantics, linking machine-processable content with meaning for humans, based on engineering terminology.
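To make the idea of machine-processable semantics concrete, the following is a minimal sketch using the third-party Python library rdflib; the namespace, class names, and instance are invented for illustration and are not part of any cited standard.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

# Hypothetical engineering namespace; not a real published vocabulary.
ENG = Namespace("http://example.org/engineering#")

g = Graph()
g.bind("eng", ENG)

# A tiny shared ontology: a class, a subclass, and one described instance.
g.add((ENG.Turbine, RDF.type, RDFS.Class))
g.add((ENG.SteamTurbine, RDFS.subClassOf, ENG.Turbine))
g.add((ENG.unit42, RDF.type, ENG.SteamTurbine))
g.add((ENG.unit42, RDFS.label, Literal("High-pressure steam turbine")))

# Serialized as Turtle, the same statements are readable by machines and humans.
print(g.serialize(format="turtle"))
```

Any agent that shares the eng: vocabulary can draw the same conclusions from these triples, which is precisely the point of a shared ontology.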
Creating standards, especially standards that generate information industry infrastructure, is difficult, time-consuming, and at constant risk of irrelevance and failure. One way to mitigate this risk, and to secure the participation of the diverse interest groups required to make such standards a success, is to focus on process: the process that produces and maintains a good standard. This is in contrast to an approach that says some existing artifact selected from a list will be the standard, and all the others will not. Our observation is that it does not much matter where you start, that is, which terminology or terminologies one selects as a starting point; what does matter is the process by which the proposed standard evolves to achieve and sustain the desired degree of quality, comprehensiveness, and functionality. The process is what determines where the standard ends up.
Seen in this light, change, even a large amount of change, will be a feature of any successful formal terminology, or ontology. We hope to demonstrate the feasibility and utility of this approach. The challenge in the context of the Semantic Web is to choose a representation for change that makes it explicit. We assume that, for an engineering tool kit viewed in this way, change itself would become part of the Semantic Web’s semantics. The challenge with this approach is the formulation of the units of change and the creation of an ontology of these change units. Making change part of the Semantic Web would preserve consistency. One way to focus the development of the desired units, interrelationships, and uses is to solve real problems and gain experience from deployments of these solutions; we propose to do this by formulating, deploying, and evaluating what we now call “The New Engineering Transaction.” This transaction needs to supply reverse-engineering, operating-scheme, and control information systems with the requisite formal definition of a new “insert,” given a reference model or application, and to do so at Web scale. The main challenge is how to do this in a way that first avoids breaking working applications that use the engineering tool-kit terminology, and second preserves the longitudinal value of existing and future engineering innovations of implementation.
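The source does not fix a concrete data structure for these change units, so the following is only a sketch, under the assumption that a unit records what kind of change it is, what it applies to, and when it takes effect; all names are invented.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ChangeUnit:
    """One unit of change to a formal terminology, itself a first-class 'thing'."""
    kind: str                                    # e.g. "add", "retire", "revise"
    subject: str                                 # concept/term/relationship affected
    payload: dict = field(default_factory=dict)  # the new or revised content
    effective: date = date(2013, 1, 1)           # when the change takes effect

# A "New Engineering Transaction" is then just an ordered list of such units.
txn = [
    ChangeUnit("add", "eng:CompositeBearing",
               {"is-a": "eng:Bearing", "label": "Composite bearing"}),
]
for unit in txn:
    print(unit.kind, unit.subject, unit.payload)
```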
At a deeper level, we believe that the Semantic Web is an opportunity to shrink the “formalization gap” between engineering disciplines. We argue that overcoming this gap is the fundamental change facing the engineering community. This discontinuity in formalization, between human-connected information processes and the machine code necessary to accomplish comparable ends, begins at a very high descriptive level and is not itself a concern of computer science. If this concern is to be given a name at all, it must be regarded as concerning engineering applications, and it is increasingly
being referred to as “engineering information science” in Finland and “Engineering Informatics” in Europe and the US. It will be the task of this new discipline to better understand and define the engineering information processes we have considered here, in order that appropriate activities will be chosen for computerization, and to improve the man-machine system for better interoperability.
DEFINITION OF CYBERINFRA PRODUCT CONCEPT
It is essential to define the cyberinfrastructure as a set and as a product concept. For methodical purposes, and for the chapter in general, we first define cyber, infrastructure, and product concept in sequence.
- Cyber: A virtual imitation of things that are anywhere and everywhere; it is what exercises the surveillance. It produces, stimulates, propagates, and polarizes signals through all possible media. It is simply a carrier, an agent.
- Infrastructure: Networks, like traffic roads or nodes, employing the communications transported by the Cyber. When we combine cyber and infrastructure, we get the cyberinfrastructure; in short, let us call it “Cyberinfra.”
- Product Concept: The proof of some idea or knowledge that has turned into a physical product2. The product has a thought or concept behind it, based on its function, behavior, utilization, or community segment. When we integrate all three words into one “function,” as our topic demands, it becomes the cyberinfraproduct concept, which we shall define further. It carries the meaning of “being everywhere, anywhere”; for this purpose we call it the cyberinfraproduct concept. As a technological niche, this concept is rich with opportunities and challenges. This technology niche would be able to access the entire set of signal trajectories that are reflected, communicated, or fragmented. The frequency can be low or high. The environment can be a closed or an open loop, and the signals can spread naturally or through one or several media; the system is capable of routing this propensity, which is processed and used collectively or individually, in single-in or single-out mode. The perception properties are not tagged for long and have no carriers; a signal travels a short distance before it collides. The product is not necessarily a physical product anymore; it need not have a physical outlook and could simply be bits and bytes.
TECHNOLOGY ROADMAP FOR CYBERINFRA INTEGRATION
We summarize the vision of change in a technology roadmap. It envisions a time frame of five years, with the mission reaching a suitable level by 2010. This follows NIST3: in early 2000, NIST proposed that the Semantic Web would become a reality in industry and that a new “Knowledge Web,” that is, a new Internet, would once again revolutionize the world. Figure 1 describes an onion model of the roadmap toward enterprise integration and full use of semantic infrastructure in an open system architecture with interoperability and plug-and-play capability.
CYBERINFRA ARCHITECTURE
The underlying difficulty is not with speed or quantity alone, but with relevance. How does a system, given all that it knows about aardvarks, Alabama, and ax handles, “home in on” the pertinent fact that bananas don’t get hungry, in the fraction of a second it can afford to spend on the pronoun “it”? The answer proposed is both simple and powerful: common sense is not just randomly
Figure 1. Technology roadmap – Cyberinfra integration
stored information, but is instead highly organized by topics, with lots of indexes, cross-references, tables, hierarchies, and so on. Therefore, it is necessary to have a Cyberinfra architecture. The architectural notion, or vocabulary, in engineering at this point is the information, stored in one or several databases and connected to one or several servers. In this context, let us reframe the terminology: the servers in question are ontology servers. The words in a sentence themselves trigger the “articles” on monkeys, bananas, hunger, and so on, and these quickly reveal that monkeys are mammals, hence animals; that bananas are fruit, hence from plants; and that hunger is what animals, or we human beings, feel when we need to eat; and that settles it. The issue of relevance is solved by the antecedent structure in the stored knowledge itself. These types of servers “normalize” terminology functions for enterprises, some at Web scale. We believe that such servers will be essential to support the Semantic Web, and the Web as usual. The challenge will be how to maintain them in loose synchrony, as appropriate.
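A minimal sketch of this topic-indexed lookup follows; the facts and names are the illustrative monkey-and-banana ones from the text, not any real knowledge base.

```python
# "Common sense" organized as explicit is-a links rather than a flat pile of
# facts: relevance falls out of following the links.
ISA = {
    "monkey": "mammal", "mammal": "animal",
    "banana": "fruit", "fruit": "plant",
}
FEELS_HUNGER = {"animal"}  # hunger is what animals feel

def ancestors(term):
    """Walk the is-a chain upward from a term."""
    while term in ISA:
        term = ISA[term]
        yield term

def can_be_hungry(term):
    return term in FEELS_HUNGER or any(a in FEELS_HUNGER for a in ancestors(term))

print(can_be_hungry("monkey"))  # True:  monkey -> mammal -> animal
print(can_be_hungry("banana"))  # False: banana -> fruit -> plant
```

The lookup touches only the short is-a chains for the two words, which is the point: the antecedent structure, not a scan of everything known, settles the pronoun.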
The fact is that formal terminologies will always be constructed and maintained by geographically divided domain experts. This means that we need additional support for “configuration management,” which handles conflict resolution, and so on.
- One short-term reality is the need for what we call “local enhancement.” In other words, the ontology builders and their server must specify clearly the “business requirements” that are to be addressed for common use in electronic commerce and Business-to-Business (B2B). Let us turn to an architecture that can be drawn to simplify the concept (Figure 2).
Here the builder uses an object-oriented knowledge representation model, based on and compatible with a common knowledge model, and designed to use the best practices from other frame-based systems. It is assumed to support operations on classes, slots, facets, and individuals. Interoperability, knowledge sharing, and reuse are important goals, and the builder works as a fully compliant server. It should support a meta-class architecture in an open environment, to allow the introduction of flexible and customizable behaviors into an engineering ontology. It is able to predefine certain system constants, classes, and primitives in a default upper engineering ontology, which can be extended or refined to change the knowledge model and behaviors within the system.
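As a hedged illustration of the classes-slots-facets vocabulary above, here is a self-contained frame sketch; the Artifact and Pump frames and their slots are invented examples, not the model of any particular server.

```python
class Slot:
    """A named property of a frame; facets qualify the slot (type, unit, ...)."""
    def __init__(self, name, **facets):
        self.name, self.facets = name, facets

class Frame:
    """A class in a frame-based knowledge model, with slot inheritance."""
    def __init__(self, name, parent=None):
        self.name, self.parent, self.slots = name, parent, {}

    def add_slot(self, slot):
        self.slots[slot.name] = slot

    def all_slots(self):
        # Slots are inherited down the class tree; local slots override.
        inherited = self.parent.all_slots() if self.parent else {}
        return {**inherited, **self.slots}

artifact = Frame("Artifact")
artifact.add_slot(Slot("function", value_type=str))
pump = Frame("Pump", parent=artifact)
pump.add_slot(Slot("flow_rate", value_type=float, unit="m3/s"))

print(sorted(pump.all_slots()))  # ['flow_rate', 'function']
```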
By threading the vision, the study may in the future concentrate on cutting-edge industry cases. A flash search will be conducted on industry needs and their product development capabilities and systems. While surveying the requirements
Figure 2. The architecture of engineering ontology
of an industry, we probe to verify the capabilities of Finnish industries. We mirror them against US industries and their peers, such as government and universities.
ROBUST DESIGN KNOWLEDGE AND CHALLENGES
We have now made the bridge and can proceed to the further phases necessary for this chapter. Engineering design creation, at this juncture, can be defined as the application of scientific and technical knowledge to solve a number of problems in and around a visible or imaginary product. Engineers use imagination, judgment, and reasoning to apply science, technology, mathematics, and practical experience in satisfying a demand that exists, or is being created, in a market environment at the time of a product’s invention. The outcome is the design, production, and operation of useful objects or processes.
The business world is characterized by an increased demand for innovation, shorter product life cycles, and enormous pressure to launch new products fast. At the same time, product development teams face cost-cutting challenges. Even in the face of these pressures, no business can afford to sacrifice robust design. To fulfill the ultimate customer promise, products must perform as expected in the real world, every day and in every circumstance. Salminen and Pillai (2004) defined and tested this in one of their works, supported by a case study called “Life-cycle Challenge Management (LcMgt).” According to this work, it is quite evident that industry is in that hectic mode in this era.
Mechanical engineering designs are known for their assemblies: a mechanical system contains complex assemblies of interconnected parts in order to function. Consider large overall motions, such as in ground-vehicle suspension assemblies, robotic manipulators in manufacturing processes, and aircraft landing-gear systems. For a faster, more efficient solution to this problem class, ANSYS (n.d.)4 provides a rigid multi-body dynamics module. Figure 3 shows a bladed disc and its assembly in a passenger aircraft used by Lufthansa Airlines of Germany.
Figure 3 points to rotor dynamics applications in design: the capability serves to spot the behavior, and support the diagnosis, of rotating structures. It is commonly used to analyze structures ranging from jet engines and steam turbines to auto engines and computer disk storage. Rotor dynamics can effectively compute critical speeds and the effect of unbalanced loads on a structure. It allows the creation of Campbell plots to identify critical
Figure 3. A bladed disc and the passenger air craft
speeds of single- or multiple-spool systems, for beam, shell, and solid elements. Knowledge-intensive scavenging is, therefore, among the most difficult scopes in smart product design. Today, three-dimensional (3D) graphics are a critical asset to engineers in all disciplines, playing countless roles in product creation, from idea to prototyping and to test-data visualization.
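For the reader unfamiliar with Campbell plots, the standard textbook criterion (not anything ANSYS-specific) can be stated compactly: a critical speed occurs where a natural frequency of the rotating structure, which varies with the spin speed, crosses an excitation order line:

\[
\omega_i(\Omega) = k\,\Omega, \qquad k = 1, 2, 3, \dots
\]

where the left-hand side is the i-th natural frequency at spin speed Ω, and k = 1 corresponds to the synchronous excitation produced by unbalance.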
The first law of thermodynamics says that “when work is expanded in producing heat, the quantity of heat generated is proportional to the work done, and conversely, when heat is employed to do work, a quantity of heat precisely equivalent to the work done disappears” (Goodenough, n.d.). In the case of an engine driving a tiny brake, every bit of mechanical work done raises the temperature of the air surrounding the brake, or of the water used to cool it. In other words, a smart product that is to survive in a cyberinfra, whether open or closed, needs knowledge intensive enough to attract, say, the surrounding signals, and must be able to process them, feeding back and forth; bandwidth is not bound by a critical fallacy. Compare the second law of thermodynamics, which is based on the fact that heat will not flow from a body of lower temperature to one of higher temperature. This fundamental law is responsible for the fact that no process of converting heat into work can ever be complete. It is never possible, or meaningful, to make all of a given heat quantity enter the conversion process, and at the end of it a certain amount of unconverted heat is always left over. The same phenomenon holds for any new cyberinfra product: a certain amount of unused or unexpected clogging remains at the end. A design pattern is weighed against the next “best” built from scratch (Pillai, Pyykkonen & Salminen, 2009).
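The two laws invoked above can be written compactly (these are the standard forms, not specific to the cited handbook): the first law as energy conservation, and the second law as the Carnot bound that guarantees some heat is always left unconverted:

\[
\Delta U = Q - W, \qquad \eta \;\le\; \eta_{\mathrm{Carnot}} = 1 - \frac{T_c}{T_h} \;<\; 1,
\]

where Q is the heat supplied, W the work extracted, and T_c < T_h the absolute temperatures of the cold and hot reservoirs.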
DESIGN KNOWLEDGE-INTENSITY AND CYBER-CHALLENGES
Product design knowledge has been studied extensively, and an immense number of modeling techniques have been developed on that basis. A large share of them are tailored to specific products or to specific characteristics of the design activities. To fit an ideal semantic logic, geometric modeling is mainly used for supporting detailed design, while knowledge modeling serves to support conceptual design, as we discuss in more detail later in this chapter. The National Institute of Standards and Technology (NIST) set up a project based on the aforementioned techniques, “a design repository.” The NIST team attempted to model three fundamental facets of an artifact representation: the physical layout of the artifact (form), an indication of the overall effect that the artifact creates (function), and a causal account of the operation of the artifact (behavior) (Szykman, Sriram, & Regli, 2001). NIST (Jia, 2007) has recently made an effort via this study. It addresses the development of the basic foundations of the next generation of Computer-Aided Design (CAD) systems, which would be able to build on a core representation for design information, called the NIST Core Product Model (CPM) (Fenves, 2001). It has a set of derived models defined as extensions of the CPM (e.g., Zha & Sriram, 2004; Economist, 2012). The NIST core product model has been developed to unify and integrate product and assembly information. The CPM provides a base-level product model that is not tied to any vendor software. It is meant to be open, non-proprietary, expandable, and independent of any one product development process. It is capable of capturing the engineering context that is most commonly shared in product development activities. The mechanism is supposed to operate on an artifact representation that includes function, form, behavior, material, and physical and functional decompositions, and it consistently interlinks the relationships among these concepts. The entity-relationship data model influences the model a great deal: accordingly, it consists of two sets of classes, called object and relationship, equivalent to the Unified Modeling Language (UML) class and association class, respectively. The Entity-Relationship model (ER model) itself, developed by Chen (1976) in a software engineering setting, is an abstract way to describe a database. Chen’s diagram is seen here as Figure 4.
An entity may be defined as a thing that is recognized as being capable of an independent existence and that can be uniquely identified. An entity is an abstraction from the complexities of a domain. When one speaks of an entity, one normally speaks of some aspect of the real world that can be distinguished from other aspects of the real world (Beynon-Davies, 2004). An entity may be a physical object such as a house or a car, an event such as a house sale or a car service, or a concept such as a customer transaction or order. Although the term entity is the one most commonly used, following Chen we should really distinguish between an entity and an entity-type. An entity-type is a category; an entity, strictly speaking, is an instance of a given entity-type, and there are usually many instances of an entity-type. Because the term entity-type is somewhat cumbersome, most people tend to use the term entity as a synonym for it. Entities can be thought of as nouns; examples are a computer, an employee, a song, a mathematical theorem. They mature into use from above, as in an ontology-connected configuration from which knowledge can be extracted. The other main orientation toward artificial intelligence, the pattern-based approach, often called “connectionism” or “parallel distributed processing,” reemerged from the shadow of symbol processing only in the 1980s. In many cases, better knowledge can be more important
Figure 4. Simple model that connects relationship entities
for solving a task than better algorithms. To have truly intelligent systems, knowledge needs to be captured, processed, reused, and communicated. An ontology-based system would support all these tasks, say Salminen and Pillai (2007). They researched the ontology phenomenon, which had just been unearthed in Silicon Valley and in Maryland, USA. While one of the authors was at NIST, an immense number of standardization agenda items were pending in ontology. It is time to dip a bit deeper here into the application trends in the Internet and electronic manufacturing sector.
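The entity/entity-type and object/relationship distinctions above translate directly into code; this is only an illustrative sketch, with invented example data, not the CPM schema itself.

```python
from dataclasses import dataclass, field

@dataclass
class EntityType:
    """The category, e.g. 'Car'; Chen's entity-type."""
    name: str

@dataclass
class Entity:
    """An instance of a given entity-type, with its own attribute values."""
    entity_type: EntityType
    attributes: dict = field(default_factory=dict)

@dataclass
class Relationship:
    """Links two entities, mirroring the CPM-style object/relationship split."""
    name: str
    source: Entity
    target: Entity

car = EntityType("Car")
customer = EntityType("Customer")
my_car = Entity(car, {"plate": "ABC-123"})
alice = Entity(customer, {"name": "Alice"})
owns = Relationship("owns", alice, my_car)
print(owns.name, owns.source.attributes["name"], "->", owns.target.attributes["plate"])
```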
The term “ontology” should first be defined. It is simply an explicit specification of a conceptualization. Ontologies are able to capture the structure of the domain, that is, the conceptualization. This includes the model of the domain with possible restrictions. The conceptualization describes knowledge about the domain, not about a particular state of affairs in the domain. In other words, the conceptualization does not change, or changes very rarely. Ontology, then, is the specification of this conceptualization: the conceptualization is specified by using a particular modeling language and particular terms. Formal specification is required in order to be able to process ontologies and to operate on them automatically. An ontology describes a domain, while a knowledge base (based on the ontology) describes a particular state of affairs. Each knowledge-based system or agent has its own knowledge base, and only what can be expressed using the ontology can be stored and used in the knowledge base. When an agent wants to communicate with another agent, it uses the constructs from some ontology. In order to understand the communication, the ontologies must be shared between the agents. Although an ontology is required to be formally defined, there is no common definition of the term “ontology” itself (Pillai, 2002). The definitions can be categorized into roughly three groups:
- Ontology is a term in philosophy and its meaning is “theory of existence”.
- Ontology is an explicit specification of a conceptualization.
- Ontology is a body of knowledge describing some domain, typically common sense knowledge domain.
CYBERINFRA METHODOLOGY DEVELOPMENT
There is a great need for most businesses to develop new products and services in an Open System Architecture. The sustainable growth of a business rests on the services that corporations offer (Salminen & Pillai, 2005). The smart products, or cyberinfra products, of today increasingly embed intelligence; therefore, the role of the product is very important. The best possible product architecture for the optimized product platform thus needs a method. It is achieved by organizing and recognizing the product modeling and by managing the dependency matrix of the product. In some cases, the product must be re-engineered to achieve an optimum cash return.
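The “dependency matrix” can be sketched in the style of a design structure matrix (DSM); the modules and dependencies below are invented for illustration, and the closure step simply exposes indirect dependencies.

```python
import numpy as np

modules = ["sensor", "radio", "power", "ui"]
# dsm[i][j] = 1 means module i depends directly on module j.
dsm = np.array([
    [0, 1, 1, 0],   # sensor depends on radio and power
    [0, 0, 1, 0],   # radio depends on power
    [0, 0, 0, 0],   # power depends on nothing
    [1, 1, 1, 0],   # ui depends on everything else
])

# Transitive closure: indirect dependencies the platform architect must manage.
reach = dsm.copy()
for k in range(len(modules)):
    reach = reach | (reach[:, [k]] & reach[[k], :])

for i, name in enumerate(modules):
    deps = [modules[j] for j in range(len(modules)) if reach[i, j]]
    print(f"{name} -> {deps}")
```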
The methodology phase here contains freshly created new knowledge and attributes its redundancy, or the lack of it. This phase draws up the possibilities of self-integrating schemes. This includes manufacturing system integrations, netting them into the business loop and thus creating a B2B platform to serve B2C (Salminen & Pillai, 2007). Figure 5 introduces the agent mechanisms for Internet-based intelligent and electronic manufacturing.
Figure 6 describes the self-integration patterns by using semantic infrastructure.
Figure 7 explains the connectivity of the Semantic Web to the intelligent and electronic manufacturing concept; the methodology is outlined in Figures 7 and 8. An intelligent and electronic manufacturing system requires building the
Figure 5. Agent-based mechanisms over internet browser for intelligent and electronic manufacturing
Figure 6. Self-integration environment
foundation that routes the Semantic Web in a “plug-and-play” format (Salminen & Pillai, 2007). The repository system feeds a strong linguistic representation via online modeling, plugging it in between the structures and classes. The internal system is sustained by a search engine that simulates and fractionates, via reframing, the engineering entities into an understandable semantic structure. The semantic infrastructure for plug-and-play collaboration was created by the National Institute of Standards and Technology (NIST) on their test-bed facility, a so-called information platform. The Process Specification Language (PSL) was also implemented on the test bed (Figure 8).
When this methodology was introduced for application at the test-bed facility, it was too early. It is tempting to say that sheer speed will no longer suffice, and that more knowledge of chess, or of something else, is needed. What this sketch shows is that a state transition must be represented as a pair, forward >< backward. The forward part is the information about how to get to the state from the
Figure 7. Intelligent and electronic manufacturing system integration for semantic web
Figure 8. Methodology on intelligent and electronic manufacturing system integration for semantic web
immediately preceding state. The backward part is the information about how to undo the forward part, that is, how to go back to the immediately preceding state. A procedure was initiated through this sketch. We tried to coin the process specification language and its semantic impacts by adding a modeling tool. This tool models the context into a machine-understandable language, thus pinning down a representation that adds value to this new standard. In practice, it proved that it does not rule out the goof. The purpose of this exercise was to evaluate the methodology while building the semantic infrastructure, where one path leads to more functionality for PSL. However, we learned that the representation needs to be evaluated before one can bring in more functionality. In this case we proposed a requirement analysis and mapping tool called Optiwise®5. We also experimented to see the impact of adding RDF (Resource Description Framework) with ontology, to bring the representation task to the application layer. The result was not encouraging at all.
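A minimal sketch of the forward >< backward pairing follows; the state and transitions are invented, and the pattern is simply an undoable-command log under the reading given above.

```python
class Transition:
    """A state transition carried as a pair: how to advance, how to undo."""
    def __init__(self, forward, backward):
        self.forward, self.backward = forward, backward

class StateLog:
    def __init__(self, state):
        self.state, self.history = state, []

    def apply(self, t):
        self.state = t.forward(self.state)
        self.history.append(t)

    def undo(self):
        self.state = self.history.pop().backward(self.state)

log = StateLog({"pressure": 1.0})
double = Transition(lambda s: {**s, "pressure": s["pressure"] * 2},
                    lambda s: {**s, "pressure": s["pressure"] / 2})
log.apply(double)
print(log.state)  # {'pressure': 2.0} - forward
log.undo()
print(log.state)  # {'pressure': 1.0} - backward restores the preceding state
```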
PERFORMANCE MODEL: SEMANTIC DEFINITION
Michael Faraday (1791-1867) said long ago, very cleverly, that “Nothing is too wonderful to be true if it be consistent with the laws of nature, and in such things as these, experiment is the best test of such consistency” (wikiversity.org). A request arriving in any format should be answered with end-to-end performance. The semantic definition must be clear from the request structure. There are three structural classes: mechanical, process, and controls. A database is linked automatically to a system when specified in the process. A process model based on this statement is spread out in Figure 9 for easy representation.
Here the associated tag means that the item belongs to a structural identification: it should belong to either the mechanical, process, or control properties, and it is associated with a service, a product, or both. This is a theoretical model that has been tested in practice. Its content represents a portion that is undefined but used in paper engineering as a tool kit. It is formularized and used for
Figure 9. Identification of the tags (mechanical, process, control)
common process automation. It is primarily based on public and private vocabularies of the predefined instances or industry.
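The tag scheme of Figure 9 reduces to a small amount of code; this sketch is illustrative only, and the request identifier and values are invented.

```python
from enum import Enum

class StructuralClass(Enum):
    MECHANICAL = "mechanical"
    PROCESS = "process"
    CONTROL = "control"

class Association(Enum):
    SERVICE = "service"
    PRODUCT = "product"
    BOTH = "both"

def tag(request_id: str, cls: StructuralClass, assoc: Association) -> dict:
    """Attach the structural identification a request must carry."""
    return {"id": request_id, "class": cls.value, "associated_with": assoc.value}

print(tag("req-001", StructuralClass.CONTROL, Association.SERVICE))
# {'id': 'req-001', 'class': 'control', 'associated_with': 'service'}
```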
CYBERINFRA PROCESS
The model presented in Figure 9 might further be said to be little more than an academic exercise. As a result, there is no warehouse of engineering tool-kit descriptions that can be reached over time; the changes across the terminologies used there are meant to formalize a solution toward a common understanding through the Web. No engineering tool kit describes repositories that support such “time-travel” without queuing in the same manner, and none uses the existing or proposed standards. An explicit goal of this project is to begin to overcome this shortfall, at least in the context of engineering. The first step is making the formal-terminology change itself into a new terminology/ontology “thing,” or “unit.” This new one, in turn, creates a unit of change that has the same general properties as any other “thingness” unit. For example, consider an appropriate reference set of taxonomies used (in the structural-logic sense) to “classify” an engineering notion. One can create a desired reference terminology by adding the definition of each application, allowing one new application at a time. Frequently, a new application comes with new mechanisms of action. Furthermore, new indications (implementations) are set, and thus the corresponding “new thing” may need to update the reference taxonomies before adding the definition of the new thing, and so on. To keep this simple, there is one potential path that stays closer to the term being updated: the newly created term does not require an update to the reference. This is due to the following:
- The new thing is “published,” as XML, as a newly “structured” version of the package insert, or “label,” designed to “explain” everything to both humans and computers.
- Further processing enhances the parts of the label that can be processed usefully by computers, and then “publishes” it, once again in XML. The “enhancements” may include connections to the mechanical, process, and control engineering literature, related terminology, and foreign-language names.
- Applications or servers electing to process the new-thing transaction will see that the XML indicates an “add,” the simplest kind of transaction to process (a sketch of such a transaction follows this list). That is the transaction that adds a new concept, the new thing, the appropriate relationships to other concepts across the various reference taxonomies, and attributes.
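A hedged sketch of such an “add” transaction serialized as XML follows, using only the Python standard library; the element and attribute names are invented, since the source does not define a concrete schema.

```python
import xml.etree.ElementTree as ET

# Build the "add" transaction: a new concept plus its taxonomy links.
txn = ET.Element("transaction", kind="add")
concept = ET.SubElement(txn, "concept", id="eng:NewThing-001")
ET.SubElement(concept, "label").text = "New thing"
ET.SubElement(concept, "relationship",
              type="is-a", target="eng:ExistingTaxonomyNode")
ET.SubElement(concept, "attribute", name="status", value="proposed")

# Published as XML, readable by both the applications and the humans involved.
print(ET.tostring(txn, encoding="unicode"))
```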
It is not hard to imagine that most applications would take note of such an insertion and subsequently “do the right action.” However, the problem with this simple form of the new-thing transaction is, as described by domain experts, that most new things represent “changes in understanding,” and it is not at all clear how existing applications would deal with such changes in understanding automatically, or know when they need help from human interaction. In this context, “changes in understanding” are represented by changes in the reference taxonomies.
CYBERINFRA: ONTOLOGY
We view a formal terminology or ontology as a corpus of “facts,” or assertions, collected over time; one can then contemplate an ontology of such facts, or changes. The goal of this operation is to evaluate and adapt the Semantic Web infrastructure and to implement the ontologies for B2B (Business-to-Business) processes for engineering systems (Pillai & Salminen, 2004), as sighted in that reference. B2B for engineered products is very complex and has no foundation for easy sharing of product, process, or production information. The opportunity is a new semantic language that encodes meaning, being developed, or already existing, for the Web. This is the basis for B2B in engineered products and services. The difficulty is defining and implementing the semantics to be attached to each type of change unit. One step toward such semantics is the simple expedient of tagging each terminological unit (concept, term, relationship, and attribute) with a “start entity” and an “end entity.” More disciplined and complete forms of such semantics are what is needed to preserve the longitudinal functionality of the systems that use the ontology, and what will be needed to transfer knowledge gained from a successful test of the new-thing transaction to the Semantic Web (Salminen & Pillai, 2007). Therefore, even when the user interface returns an exact equivalent for a casual term, users may choose a “better” formal term from the displayed semantic neighborhood. The simple explanation of this phenomenon is that humans are better at recognition than recall. Those who develop ontologies are familiar with the phenomenon: once domain experts can “see” a domain model, they can almost always make it better. Doumeingts et al. of the European Union (EU), for instance, proposed a draft CEN (European Committee for Standardization) framework on interoperability schemes (Doumeingts, Li, Piddington, & Ruggaber, 2005), though it has not been widely explored. Figure 10 shows the same approach as we explored elsewhere in this chapter.
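One natural reading of the start/end tagging, sketched here under that assumption (the field names and dates are invented), is a validity interval that lets systems ask what a terminology looked like at any point in time:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class TermUnit:
    kind: str                   # "concept" | "term" | "relationship" | "attribute"
    name: str
    start: date                 # when the unit entered the terminology
    end: Optional[date] = None  # None means the unit is still current

    def valid_on(self, day: date) -> bool:
        return self.start <= day and (self.end is None or day < self.end)

old = TermUnit("term", "wireless telegraphy", date(1900, 1, 1), date(1960, 1, 1))
new = TermUnit("term", "radio link", date(1960, 1, 1))
snapshot = date(2013, 1, 1)
print(old.valid_on(snapshot), new.valid_on(snapshot))  # False True
```

Retired units are never deleted, only closed off, which is what preserves the longitudinal functionality discussed above.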
KNOWLEDGE-INTENSIVE SOLUTION-BASED CYBERINFRA DESIGN
We are now at the next step, reaching out to intelligence as a whole, particularly the artificial side of it. It is valuable to know how that works in this concept. In control theory (Jia, 2007), we dwell on system performance as well as its functions, and this applies equally when creating an engineering product, whether imaginary or physical. Embedded systems of the kind designed here
Figure 10. CEN framework for enterprise interoperability6
often reside in machines. Those machines are expected to run continuously for years without errors, and in some cases they should recover by themselves if an error occurs. Therefore, the software is developed and tested more cautiously and more rigorously than for personal computers. Unreliable mechanical moving parts, such as disk drives, switches, or buttons, are avoided entirely. The old way of making things involved taking lots of parts and screwing or welding them together. Now a product can be designed on a computer and “printed” on a 3D printer, which creates a solid object by building up successive layers of material. The digital design can be tweaked with a few mouse clicks. A 3D printer can run unattended and can make many things that are too complex for a traditional factory to handle. The applications of 3D printing are especially mind-boggling as an exercise of the present-day technology of this decade. Already, hearing aids and high-tech parts of military jets are being printed in customized shapes. The geography of supply chains will change, or they will no longer exist. An engineer working in the middle of a wilderness who finds that he lacks a certain tool no longer has to have it delivered from the nearest city; he can simply download the design and print it. The days when projects ground to a halt for want of a piece of kit, or when customers complained that they could no longer find spare parts for things they had bought, will one day seem quaint.
At this point it is worth noting, from the design point of view, that the materials used in manufacturing are lighter, stronger, and more durable than the old ones. Consider carbon fiber, which is replacing steel and aluminum in products ranging from aircraft to mountain bikes. New techniques now let engineers shape objects at the nano scale. Nanotechnology is giving products better features, such as bandages that help cure cuts, engines that run more efficiently, and crockery that cleans more easily. And as the Internet allows more and more designers to work in partnership on new products, the barriers to entry are falling.
CYBERINFRA DEFINITION FOR INVENTION
The current techniques for totally custom design of droplet-based “digital” biochips do not scale well for side-by-side entities or next-generation System-on-Chip (SoC) patterns. These are expected to include microfluidic components; microfluidics and biochips are the new inventions. Cyberinfra was defined earlier as a place where lonely visionaries encounter comparable associations that tend to be inventions in nature. It follows the same principle as parties thinking of creating something new, modifying something to fit the existing, or both. This is our intention. While each of these formulas says how much information is generated by the selection of a specific message, communication theory is seldom primarily interested in these measures.
Case: Cyber-Keksimö
Let us begin by thinking of Keksimö, a typical Finnish-language word that has acquired a new meaning through its use.
In this context, we came to think about what the term “Keksimö”7 exactly means. We invented8 it! Recent history taught us, and we were inspired by Gabriel Tarde and Joseph Schumpeter and their thoughts on neoclassical economic theory. According to them, the “innovative economy” requires pragmatic grounding efforts, that is, putting ideas into practice. “Innovative” expresses the term “innovation,” from “innovare,” which means “to make new.” Novelty value and importance are always determined in relation to a specific user base or format. The term “inventio,” on the other hand, comes from the word “invenire,” “to come upon, to find.” Therefore, inventions may not at first be recognized as “new.” This was also the case with “Keksimö.” Keksimö is close to familiar words such as “invent” and “invention,” but in the sense of a place where discoveries are made and processed the term is fresh and new. Keksimö, therefore, is an “invention,” because the term is “found” familiar to the existing natural logic and “introduced,” or “passed,” into a new logical sense of application. This is the so-called novel era of use. It does not necessarily ask to be understood, and it may barely stay noticed. Keksimö has become a secondary question in relation to how much it affects the way we perceive, act, think, and discuss.
INVENTION PROCEDURES
The predicament of normal inventors9 is quite daring: how, and where, to begin, given the heavy bureaucratic setup in Europe. Figure 11 wraps up the true story of a Finn in Finland. He begins to think up new ideas to realize as a product or service. In materializing them, he is on a tight track for something; he would perhaps like to solve some existing problems. He goes through the following steps, as in Figure 11.
Figure 11. Inventor-investor conspiracy and state of confusion
Let us look at this in generic terms: the performance, substantives, or activities of inventions in one’s day-to-day life spectrum. The transition logic of this aspect could be drafted as in Figure 12.
CYBER-PLATFORM DESIGN: KISBED
It is right at this point to share the fundamental concept of prototyping the KISBED platform (see Figure 13). The concept in its nascent form is distilled into three steps. At the orchestrating front, every single participant has to divide their roles before the defragmenting phase. Let us now add this step to the orchestrating front, the initial one in sight: the participant moves to a comforting segment where the other activities of the ensemble are set to perform in a timely manner. Figure 13 leads into the synchronizing phase of Figure 14. Notwithstanding, the KISBED method is described in several steps inside the camera of functions, and the initial steps of “invention” meet the sturdiest bricks before passing the level on to entrepreneurship.
USE CASE: KNOWLEDGE INTENSIVE CYBERINFRA PLATFORM
We know, and it has been reported (Economist, 2012), that the first industrial revolution began in Great Britain in the late 18th century with the mechanization of the textile industry. Tasks until then done laboriously by hand in hundreds of weavers’ cottages were brought together in a single cotton mill, and the factory was born. The second industrial revolution arrived in the early 20th century, when Henry Ford mastered the moving assembly line in manufacturing and ushered in the age of mass production. The first two industrial revolutions made people richer and more urban. Now a third revolution is under way. Mechatronics represents a unifying engineering science paradigm. This science sector is bound to an interdisciplinary knowledge area that addresses interactions, in terms of ways of working and thinking, and is able to spin together the practical experiences extracted from theoretical knowledge (Habib, 2008). The physical, the solid, and the entire manufacturing scheme are going digital. An amazing number of high technologies are converging. The system is contrived with intelligent software,
Figure 12. Invention phases: generic setup
Figure 13. Paradigm of KISBED method
Figure 14. Synchronized phases few elements at the KISBED
novel materials, more dexterous robots with new processes (including three-dimensional printing), and a whole range of Web-based or Web-enabled services. The factory of the past was based on cranking out zillions of identical products. Ford famously said that car buyers could have any color they liked, as long as it was black. But the cost of producing much smaller batches of a wider variety, with each product tailored precisely to each customer’s whims, is falling. The factory of the future will focus on mass customization, and may look more like those weavers’ cottages than Ford’s assembly line.
Hayes (2013) explained the computational problems well in one of his articles in American Scientist (Hayes, 2013): roughly 30 years ago, Harold J. Morowitz, who was then at Yale, set forth a bold plan for molecular biology (Barile & Razin, 1989). Morowitz outlined a campaign to study one of the smallest single-celled organisms, a bacterium of the genus Mycoplasma. The first step was to decipher its complete genetic sequence, which in turn would reveal the amino acid sequences of all the proteins in the cell. In the 1980s, reading an entire genome was not the routine task it is today, though Morowitz argued that the analysis should be possible if the genome was small enough. He calculated the information content of mycoplasma DNA to be about 160,000 bits, added that this much DNA would code for about 600 proteins, and suggested that the logic of life can therefore be written in 600 steps. Completely understanding the operations of a prokaryotic cell is thus a concept that can be visualized, and one within the range of possibility. There was one more fascinating element to Morowitz’s plan: at 600 steps, a computer model becomes feasible, and every experiment that can be carried out in the laboratory can also be carried out on the computer. The extent to which the two match measures the completeness of the paradigm of molecular biology in our modern scientific history.
Today, looking back on these proposals with genomics and proteomics in hand, there is no doubt that Morowitz was right about the viability of collecting sequence data (Bonarius, Schmid, & Tramper, 1997). On the other hand, the challenges of writing down “the logic of life” in 600 steps and “completely understanding” a living cell still look fairly daunting. And what about the computer program that would simulate a living cell well enough to match experiments carried out on real organisms? A program with exactly that goal was published last summer by Covert of Stanford University and his coworkers (Brenner, 2010). The program, called the WholeCell10 simulation, describes the full life cycle of Mycoplasma genitalium, a bacterium from the genus that Morowitz suggested. The model includes all the major processes of life: transcription of DNA into RNA, translation of RNA into protein, metabolism of nutrients to produce energy, replication of the genome and the structural constituents, and ultimately reproduction by cell fission. The outputs of the simulation do seem to match experimental results (Brenner, 2010).
Considering the immense work of the bright people out there, this niche approach seems genuine in three ways when adapted to a scenario of the natural life spectrum. First, we already know that there are billions of ones-and-zeros in our cyberspace, and we now have the problem of storing, processing, and reusing them. Second, the zeros-and-ones carry a meaningful kernel when they are in transition. Third, there is the ontology adaptation needed to understand and translate the meaning to the next phase of orderly use, which we might process in clouds. The cloud technology we refer to here is part of our foundation, on which we have the knowledge-intensive cyber-platform, an improved one. We fit this next-generation technology in live (meaning improved and advanced user capability). The fitting takes the form of an Application Programming Interface (API). This API would reduce cost, and it is device- and location-independent with multi-tenancy; in other words, it shares the resources. The centralized infrastructure is an additional feature, keeping the system steady in controlling peak-load capacity and adding efficiency. Count in the other features of the system: it is scalable and has enforced, remarkable security; it is easy to maintain and to measure anytime, anywhere; and the system is assured to be available whenever needed. Figure 15 shows a solution that is very close to the patent-pending11 concept.
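Multi-tenancy, in the sense used above, can be illustrated with a few lines of code; the tenants and keys are invented, and this is only a sketch of the resource-sharing idea, not the patented platform.

```python
class CloudStore:
    """One shared service; per-tenant isolation is enforced by the tenant key."""
    def __init__(self):
        self._data = {}  # shared infrastructure, partitioned per tenant

    def put(self, tenant: str, key: str, value):
        self._data.setdefault(tenant, {})[key] = value

    def get(self, tenant: str, key: str):
        # A tenant can only ever see its own partition.
        return self._data.get(tenant, {}).get(key)

store = CloudStore()
store.put("factory-fi", "recipe", "A")
store.put("factory-us", "recipe", "B")
print(store.get("factory-fi", "recipe"))  # A - the other tenant's data is unreachable
```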
Figure 15. KISBED: a global approach where it plays
Figure 15 is a generic approach of the knowledge-intensive platform (Pillai, Pyykkönen, & Salminen, 2009). The peers’ active roles are plotted in more detail in Figure 16.
This platform is based on an embedded system with a strong algorithm, where the communication would simply be implemented in ZigBee, Wi-Fi, or UWB. It is also possible to run any of the latest standard protocols. However, it is now assumed to be specific to the normal frequency range between 486 MHz and 2.6 GHz (Covert, Xiao, Chen, & Karr, 2008). On the other hand, this is not the final specification of the platform. When application-specific systems are to be implemented, the points above are to be
Figure 16. Communication at the grid-cyber- semantic clouds
treated only as guidelines. Further investigation and/or implementation strategies are drawn as in Figure 17.
The knowledge-intensive methodology is a technical invention that was plotted in the year 2000. It came out of a series of research efforts that exceeded the ambition and expectations of the team, carried out partly in Silicon Valley, USA, and for the rest in Finland. The motivation resulted, at the end of the day, in a final touch that drew up the work as an international patent concept. This patent is now pending in many countries; the Finnish patent authority, however, approved it as a patent in December 2009. The patent was instrumental in digging deeper to find evidence, through several academic research efforts, for implementation in the real world. Those studies came to light in diversified application protocols. The methodology of KISBED is very complex, though mathematically it seems almost perfect when one sees the trend in clouds and semantic cyber-grids. The pipeline structure has its roots where most of the work happens, with and without databases, virtually anywhere and everywhere. One of the applications is tagged and shown in Figure 18.
CASE FORMULA 1: KISBED APPLICATION SCENARIO
One of the test-bed implementations shows the competence of the KISBED cyberinfra method in a real-time motion feature at a Formula 1 car race. The experience is personalized for each race while you watch the ground action through your Personal Digital Assistant (PDA); one can reach out and feel the performance of the Formula 1 driver on the fly. We experimented with this, with joy, on a Nokia 800 tablet. The viewer has a number of options while in motion: the driver, the car speed, the built-in cameras, or simply the associated advertisements randomly provided on the net. Alongside the video stream, one can browse a favorite driver’s statistics
Figure 17. A similar approach as this one for test-bed in manufacturing for industrial applications
Figure 18. The experience system architecture for formula 1
Figure 19. Personalized experience while formula 1 driver is in action at his race track
and earlier performances while watching the race. The only limit is the imagination of the viewer, with his or her blended lifetime experiences at the race area in time. All one needs is a simple device operated in the wireless, or cyberinfrastructure, stadium, and one absorbs the entire “happening” in his or her chair, wherever located, near or far, though this is true to believe. Figure 19 shows the real-time performance with the KISBED tool on Nokia’s tablet.
SUMMARY OF CYBERINFRA PRODUCT STRATEGY
In this chapter, we have offered a realistic groundwork. We have designed and implemented the cyberinfra product, and we have presented the concept here. We were able to suggest that the interoperability schemes function when applied in a real industrial environment. Today, the concept is almost ready and behaves right for the future. We have also specified the requirements and their challenges. We have introduced a very modern concept, with various views of it, and we have integrated all possible elements into an affordable pack. The entire system would be needed to run the concept creation and prototyping, perhaps in an open semantic infrastructure. Figure 1, as a system model at the beginning, shows the technology roadmap with the steps to reach the vision at the top. Creating a cyberinfra product with new concepts, while engaging the interoperability scheme and the plug-and-play capability, is sometimes very bold. The manufacturing industry has many stumbling blocks, not only in developing systems and software but also in creating a smart environment, which is a daunting task. There is sensor hardware and software perceiving the environment; application software that interprets and reasons about the perception data; effecter control software acting on the environment; and many support systems that pose challenges for the standards posting them to the Semantic Infrastructure, or per se the Semantic Web.
This software is commonly called “middleware.” It lies between the software applications it assists and the platform it is based on. Middleware classically resides in a layer that is built directly on other layers of middleware, characteristically forming higher abstractions with each layer. Middleware must be designed around an API (Application Programming Interface), and it provides the applications with the protocol(s) it supports (Bernstein, 1996). At the end of the chapter, we introduced case studies of how this concept creation and prototyping functionality works. One of the case studies also verified the scenario of the KISBED applications.
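The layering just described can be made concrete in a few lines; this sketch is illustrative only, with invented class and method names, showing middleware exposing an API upward and speaking a protocol downward.

```python
class Platform:
    """The layer below: it only knows how to move raw bytes."""
    def send_bytes(self, payload: bytes):
        print("platform tx:", payload)

class Middleware:
    """The layer between: an API for applications, a protocol for the platform."""
    def __init__(self, platform: Platform):
        self.platform = platform

    def publish(self, topic: str, message: str):
        # The API the application sees...
        frame = f"{topic}|{message}".encode()  # ...and the wire format it hides.
        self.platform.send_bytes(frame)

class Application:
    """The layer above: it talks only to the middleware API."""
    def __init__(self, mw: Middleware):
        self.mw = mw

    def report(self):
        self.mw.publish("sensor/temp", "21.5")

Application(Middleware(Platform())).report()
```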
In concluding, we say, and see, that there are a number of approaches and paradigm combinations for exploring the Semantic Web infrastructure. We have used a basic perceive-reason-act artificial intelligence approach (Russell & Norvig, 1995), which was also meant as a building block to categorize a distinctive “set of needs.” Additionally, it links up “some wants” and tries to address the desirable uniqueness. One might pursue this in future studies.
REFERENCES
Ansys. (n.d.). Introduction. Retrieved from www.ansys.com
Barile, M. F., & Razin, S. (Eds.). (1989). The mycoplasmas. New York: Academic Press.
Bernstein, P. A. (1996). Middleware: A model for distributed system services. Communications of the ACM, 39(2), 86–98. doi:10.1145/230798.230809.
Beynon-Davies, P. (2004). Database systems. Basingstoke, UK: Palgrave.
Bonarius, H. P. J., Schmid, G., & Tramper, J. (1997). Flux analysis of underdetermined metabolic networks: The quest for the missing constraints. Trends in Biotechnology, 15, 308–314. doi:10.1016/S0167-7799(97)01067-6.
Brenner, S. (2010). Sequences and consequences. Philosophical Transactions of the Royal Society of London, 365, 207–212. doi:10.1098/rstb.2009.0221 PMID:20008397.
CE-NET. (2010). Concurrent enterprising network of excellence. Concurrent Engineering Roadmap Vision 2010. Retrieved from http://www.ce-net.org
Chandrasekharan, B., Josephson, J. R., & Benjamins, V. R. (1999). What are ontologies, and why do we need them? IEEE Intelligent Systems, 14(1), 20–26. doi:10.1109/5254.747902.
Chen, P.-S. (1976). A sample entity – relationship diagram. Retrieved from www.isu.edu
Covert, M. W., Xiao, N., Chen, T. J., & Karr, J. (2008). Integrating metabolic, transcriptional regulatory and signal transduction models in Escherichia coli. Bioinformatics (Oxford, England), 24, 2044–2050. doi:10.1093/bioinformatics/btn352 PMID:18621757.
Doumeingts, G., Li, M.-S., Piddington, C., & Ruggaber, R. (2005). Enterprise interoperability: Research roadmap document.
Economist. (2012, April 21). Third industrial revolution. The Economist.
Fenves, S. J. (2001). A core product model for representing design information. Gaithersburg, MD: NIST.
Goodenough. (n.d.). Principles of thermodynamics. In Diesel engineering handbook (5th ed.). New York: Diesel Publication, Inc.
Habib, M. K. (2008). Interdisciplinary mechatronics: Problem solving, creative thinking and concurrent design synergy. International Journal of Mechatronics and Manufacturing Systems, 1(1), 264–269. doi:10.1504/IJMMS.2008.018272.
Hawking, S. (2002). On the shoulders of giants: The great works of physics and astronomy. London: Penguin Books.
Hayes, B. (2013). Imitation of life. American Scientist, 10–15.
Jia, Y. M. (2007). Robust h control. Retrieved from www.sciencep.com
Berners-Lee, T., Hendler, J., & Lassila, O. (2001). The semantic web. Scientific American.
Pallot, M., Salminen, V., Pillai, B., & Kulvant, P. (2004). Business semantics: The magic instrument enabling plug & play collaboration? In Proceedings of ICE 2004.
Pillai, B. (2002). Process control and non-linear system using ontology. Gaithersburg, MD: NIST.
Pillai, B. (2011). Energy saving solutions at the wireless environment. Paper presented at the Automation Systems 2011. Helsinki, Finland.
Pillai, B. (2012). Bio-sensor implant for human body – Technology solutions. Paper presented at the 8th International Conference on Humanized Systems. Daejeon, South Korea.
Pillai, B., Pyykkönen, M., & Salminen, V. (2009). Knowledge-intensive arrangement of scattered data. Patent No. 120639 dated December 31, 2009. Helsinki, Finland: Finnish Patent Office.
Pillai, B., & Salminen, V. (2004). Trends in design and connectivity to broadband. Paper presented at the 14th CIRP Design 2004. Cairo, Egypt.
Russell, S. J., & Norvig, P. (1995). Artificial intelligence: A modern approach. Upper Saddle River, NJ: Prentice Hall.
Salminen, V., & Pillai, B. (2004). Methodology on product life cycle challenge management for virtual enterprises. In Camarinha-Matos, L. M., & Afsarmanesh, H. (Eds.), Processes and Foundations for Virtual Organizations. Dordrecht, The Netherlands: Kluwer Academic Publishers.
Salminen, V., & Pillai, B. (2005). Integration of products and services – Towards system and performance. Paper presented at the International Conference on Engineering Design. Melbourne, Australia.
Salminen, V., & Pillai, B. (2007). Interoper- ability requirement challenges – Future trends. Paper presented at the International Symposium on Collaborative Technologies and Systems. Orlando, FL.
Szykman, S., Sriram, R. D., & Regli, W. (2001). The role of knowledge in next generation product development systems. ASME Journal of Computing and Information Science in Engineering, 1(1), 3–11. doi:10.1115/1.1344238.
Zha, X. F., & Sriram, S. D. (2004). Feature-based component model for design of embedded system. In Gopalakrishnan, B. (Ed.), Intelligent Systems in Design and Manufacturing, Proceedings of SPIE (Vol. 5605, pp. 226–237). Bellingham, WA: SPIE. doi:10.1117/12.571612.
ENDNOTES
- The names of this view include “Hempel’s Model,” “Hempel-Oppenheim (HO) Model,” “Popper-Hempel Model,” “Deductive-Nomological (D-N) Model,” and the “Subsumption Theory.”
- Physical products: vacuum cleaners, fridges, cars, aircraft, turbine engines, etc. There are also things distributed via the Internet that are called products. In other words, one may say that Microsoft’s or Nokia’s Internet products, other than cell phones, are not always physical.
- NIST: National Institute of Standards and Technology, Department of Commerce, USA.
- ANSYS is a system provider and design team in the US.
- Optiwise is a system owned by Real Time Systems Inc., Sunnyvale, CA, USA.
- Adopted courtesy of the Network Enabled Abilities (CEN) CWA draft 9 – N 27 Research Roadmap Working Group.
- Keksimö is Finnish, and it means “the Inventors’ factory,” where new “things” are invented and/or created. This name or term “Keksimö” was first used in practice by Balan Pillai, Matti Pyykkönen, and Vesa Salminen in their patent, which was approved by the Finnish government patent authority with the patent number 120639 and the world patent No. PCT/FI2006/050479.
- The inventors are Balan Pillai, Matti Pyykkönen, and Vesa Salminen. All are from Finland.
- There is no one called a normal inventor, or, explicitly, there is no one who is normal. We are all extraordinary individuals, and we exist only once on this planet.
- www.wholecell.ed.
- An international patent is pending for Pillai, Pyykkönen, and Salminen. In Finland it is approved, and the patent number is 120639.