
Computer Architecture and Correlithm Object Processing


Drawings

Brief Description:

Figure 1 is a schematic view of an embodiment of a special purpose computer implementing correlithm objects in an n-dimensional space

Detailed Description:

Figure 1 is a schematic view of an embodiment of a user device 112 implementing correlithm objects 104 in an n-dimensional space 102. Examples of user devices 112 include, but are not limited to, desktop computers, mobile phones, tablet computers, laptop computers, or other special purpose computer platforms. The user device 112 is configured to implement or emulate a correlithm object processing system that uses categorical numbers to represent data samples as correlithm objects 104 in a high-dimensional space 102, for example a high-dimensional binary cube. Additional information about the correlithm object processing system is described in Figure 3. Additional information about configuring the user device 112 to implement or emulate a correlithm object processing system is described in Figure 5.

 

Conventional computers rely on the numerical order of ordinal binary integers representing data to perform various operations such as counting, sorting, indexing, and mathematical calculations. Even when performing operations that involve other number systems (e.g. floating point), conventional computers still resort to using ordinal binary integers to perform any operations. Ordinal based number systems only provide information about the sequence order of the numbers themselves based on their numeric values. Ordinal numbers do not provide any information about any other types of relationships for the data being represented by the numeric values, such as similarity. For example, when a conventional computer uses ordinal numbers to represent data samples (e.g. images or audio signals), different data samples are represented by different numeric values. The different numeric values do not provide any information about how similar or dissimilar one data sample is from another. In other words, conventional computers are only able to make binary comparisons of data samples, which only results in determining whether the data samples match or do not match. Unless there is an exact match in ordinal number values, conventional systems are unable to tell if a data sample matches or is similar to any other data samples. As a result, conventional computers are unable to use ordinal numbers by themselves for determining similarity between different data samples, and instead these computers rely on complex signal processing techniques. Determining whether a data sample matches or is similar to other data samples is not a trivial task and poses several technical challenges for conventional computers. These technical challenges result in complex processes that consume processing power, which reduces the speed and performance of the system.

 

In contrast to conventional systems, the user device 112 operates as a special purpose machine for implementing or emulating a correlithm object processing system. Implementing or emulating a correlithm object processing system improves the operation of the user device 112 by enabling the user device 112 to perform non-binary comparisons (i.e. comparisons that go beyond a simple match or no match result) between different data samples. This enables the user device 112 to quantify a degree of similarity between different data samples. This increases the flexibility of the user device 112 to work with data samples having different data types and/or formats, and also increases the speed and performance of the user device 112 when performing operations using data samples. These improvements and other benefits to the user device 112 are described in more detail below and throughout the disclosure.

 

For example, the user device 112 employs the correlithm object processing system to allow the user device 112 to compare data samples even when the input data sample does not exactly match any known or previously stored input values. Implementing a correlithm object processing system fundamentally changes the user device 112 and the traditional data processing paradigm. Implementing the correlithm object processing system improves the operation of the user device 112 by enabling the user device 112 to perform non-binary comparisons of data samples. In other words, the user device 112 is able to determine how similar the data samples are to each other even when the data samples are not exact matches. In addition, the user device 112 is able to quantify how similar data samples are to one another. The ability to determine how similar data samples are to each other is unique and distinct from conventional computers that can only perform binary comparisons to identify exact matches.

 

The user device’s 112 ability to perform non-binary comparisons of data samples also fundamentally changes traditional data searching paradigms. For example, conventional search engines rely on finding exact matches or exact partial matches of search tokens to identify related data samples. For instance, conventional text-based search engines are limited to finding related data samples that have text that exactly matches other data samples. These search engines only provide a binary result that identifies whether or not an exact match was found based on the search token. Implementing the correlithm object processing system improves the operation of the user device 112 by enabling the user device 112 to identify related data samples based on how similar the search token is to other data samples. These improvements result in increased flexibility and faster search times when using a correlithm object processing system. The ability to identify similarities between data samples expands the capabilities of a search engine to include data samples that may not have an exact match with a search token but are still related and similar in some aspects. The user device 112 is also able to quantify how similar data samples are to each other based on characteristics besides exact matches to the search token. Implementing the correlithm object processing system involves operating the user device 112 in an unconventional manner to achieve these technological improvements as well as other benefits described below for the user device 112. 

 

Computing devices typically rely on the ability to compare data sets (e.g. data samples) to one another for processing. For example, in security or authentication applications a computing device is configured to compare an input of an unknown person to a data set of known people (or biometric information associated with these people). The problems associated with comparing data sets and identifying matches based on the comparison are problems necessarily rooted in computer technologies. As described above, conventional systems are limited to a binary comparison that can only determine whether an exact match is found. As an example, an input data sample that is an image of a person may have different lighting conditions than previously stored images. In this example, different lighting conditions can make images of the same person appear different from each other. Conventional computers are unable to distinguish between two images of the same person with different lighting conditions and two images of two different people without complicated signal processing. In both of these cases, conventional computers can only determine that the images are different. This is because conventional computers rely on manipulating ordinal numbers for processing.

 

In contrast, the user device 112 uses an unconventional configuration that uses correlithm objects to represent data samples. Using correlithm objects to represent data samples fundamentally changes the operation of the user device 112 and how the device views data samples. By implementing a correlithm object processing system, the user device 112 can determine the distance between the data samples and other known data samples to determine whether the input data sample matches or is similar to the other known data samples, as explained in detail below. Unlike the conventional computers described in the previous example, the user device 112 is able to distinguish between two images of the same person with different lighting conditions and two images of two different people by using correlithm objects 104. Correlithm objects allow the user device 112 to determine whether there are any similarities between data samples, such as between two images that are different from each other in some respects but similar in other respects. For example, the user device 112 is able to determine that despite different lighting conditions, the same person is present in both images.

 

In addition, the user device 112 is able to determine a degree of similarity that quantifies how similar different data samples are to one another. Implementing a correlithm object processing system in the user device 112 improves the operation of the user device 112 when comparing data sets and identifying matches by allowing the user device 112 to perform non-binary comparisons between data sets and to quantify the similarity between different data samples. In addition, using a correlithm object processing system results in increased flexibility and faster search times when comparing data samples or data sets. Thus, implementing a correlithm object processing system in the user device 112 provides a technical solution to a problem necessarily rooted in computer technologies.

 

The ability to implement a correlithm object processing system provides a technical advantage by allowing the system to identify and compare data samples regardless of whether an exact match has been previously observed or stored. In other words, using the correlithm object processing system the user device 112 is able to identify similar data samples to an input data sample in the absence of an exact match. This functionality is unique and distinct from conventional computers that can only identify data samples with exact matches.

 

Examples of data samples include, but are not limited to, images, files, text, audio signals, biometric signals, electric signals, or any other suitable type of data. A correlithm object 104 is a point in the n-dimensional space 102, sometimes called an “n-space.” The value of n represents the number of dimensions of the space. For example, an n-dimensional space 102 may be a 3-dimensional space, a 50-dimensional space, a 100-dimensional space, or any other suitable dimension space. The number of dimensions may depend on the ability of the space to support certain statistical tests, such as the distances between pairs of randomly chosen points in the space approximating a normal distribution. In some embodiments, increasing the number of dimensions in the n-dimensional space 102 modifies the statistical properties of the system to provide improved results. Increasing the number of dimensions increases the probability that a correlithm object 104 is similar to other adjacent correlithm objects 104. In other words, increasing the number of dimensions increases the correlation between how close a pair of correlithm objects 104 are to each other and how similar the correlithm objects 104 are to each other. 
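
The statistical behavior described above can be checked with a short sketch. The following Python snippet is illustrative only and is not part of the original disclosure; the dimension count and sample size are arbitrary choices. It draws random points in an n-dimensional binary space and confirms that their pairwise distances cluster around n/2 with a roughly normal spread.

# Illustrative sketch: pairwise distances between randomly chosen points in an
# n-dimensional binary space cluster around n/2, approximating a normal
# distribution as the number of dimensions grows.
import random
import statistics

def random_point(n):
    """Return a random point in an n-dimensional binary space as a list of bits."""
    return [random.randint(0, 1) for _ in range(n)]

def hamming(a, b):
    """Number of dimensions in which two binary points differ."""
    return sum(x != y for x, y in zip(a, b))

n = 64  # number of dimensions (assumed for illustration)
points = [random_point(n) for _ in range(200)]
distances = [hamming(points[i], points[j])
             for i in range(len(points)) for j in range(i + 1, len(points))]

print("mean distance:", statistics.mean(distances))   # close to n/2 = 32
print("std deviation:", statistics.stdev(distances))  # close to sqrt(n)/2 = 4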

 

Correlithm object processing systems use new types of data structures called correlithm objects 104 that improve the way a device operates, for example, by enabling the device to perform non-binary data set comparisons and to quantify the similarity between different data samples. Correlithm objects 104 are data structures designed to improve the way a device stores, retrieves, and compares data samples in memory. Unlike conventional data structures, correlithm objects 104 are data structures where objects can be expressed in a high-dimensional space such that distances 106 between points in the space represent the similarity between different objects or data samples. In other words, the distance 106 between a pair of correlithm objects 104 in the n-dimensional space 102 indicates how similar the correlithm objects 104, and the data samples they represent, are to each other. Correlithm objects 104 that are close to each other are more similar to each other than correlithm objects 104 that are further apart from each other. For example, in a facial recognition application, correlithm objects 104 used to represent images of different types of glasses may be relatively close to each other compared to correlithm objects 104 used to represent images of other features such as facial hair. An exact match between two data samples occurs when their corresponding correlithm objects 104 are the same or have no distance between them. When two data samples are not exact matches but are similar, the distance between their correlithm objects 104 can be used to indicate their similarities. In other words, the distance 106 between correlithm objects 104 can be used to identify both data samples that exactly match each other as well as data samples that do not match but are similar. This feature is unique to a correlithm object processing system and is unlike conventional computers that are unable to detect when data samples are different but similar in some aspects.

 

Correlithm objects 104 also provide a data structure that is independent of the data type and format of the data samples they represent. Correlithm objects 104 allow data samples to be directly compared regardless of their original data type and/or format. In some instances, comparing data samples as correlithm objects 104 is computationally more efficient and faster than comparing data samples in their original format. For example, comparing images using conventional data structures involves significant amounts of image processing which is time consuming and consumes processing resources. Thus, using correlithm objects 104 to represent data samples provides increased flexibility and improved performance compared to using other conventional data structures.

 

In one embodiment, correlithm objects 104 may be represented using categorical binary strings. The number of bits used to represent the correlithm object 104 corresponds with the number of dimensions of the n-dimensional space 102 where the correlithm object 104 is located. For example, each correlithm object 104 may be uniquely identified using a 64-bit string in a 64-dimensional space 102. As another example, each correlithm object 104 may be uniquely identified using a 10-bit string in a 10-dimensional space 102. In other examples, correlithm objects 104 can be identified using any other suitable number of bits in a string that corresponds with the number of dimensions in the n-dimensional space 102.
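
As a minimal sketch of this representation (Python, illustrative only; the 64-bit width simply follows the example above), a correlithm object 104 may be modeled as a categorical binary string whose length equals the number of dimensions of the n-dimensional space 102:

# Hypothetical helper: a correlithm object identified by a categorical binary
# string with one bit per dimension of the n-dimensional space 102.
import secrets

def new_correlithm_object(n_dimensions=64):
    """Return a random n-bit categorical binary string (one bit per dimension)."""
    return ''.join(secrets.choice('01') for _ in range(n_dimensions))

correlithm_object = new_correlithm_object(64)
print(correlithm_object, len(correlithm_object))  # 64 bits for a 64-dimensional space 102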

 

In this configuration, the distance 106 between two correlithm objects 104 can be determined based on the differences between the bits of the two correlithm objects 104. In other words, the distance 106 between two correlithm objects 104 can be determined based on how many individual bits differ between the correlithm objects 104. The distance 106 between two correlithm objects 104 can be computed using Hamming distance or any other suitable technique.
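
One possible implementation of this bit-by-bit comparison, assuming correlithm objects 104 are held as equal-length bit strings, is sketched below (the example strings are illustrative only):

def hamming_distance(co_a, co_b):
    """Distance 106 between two correlithm objects: the number of bit positions that differ."""
    if len(co_a) != len(co_b):
        raise ValueError('correlithm objects must belong to the same n-dimensional space')
    return sum(bit_a != bit_b for bit_a, bit_b in zip(co_a, co_b))

print(hamming_distance('1001011011', '1000011011'))  # -> 1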

 

As an example using a 10-dimensional space 102, a first correlithm object 104 is represented by a first 10-bit string (1001011011) and a second correlithm object 104 is represented by a second 10-bit string (1000011011). The Hamming distance corresponds with the number of bits that differ between the first correlithm object 104 and the second correlithm object 104. In other words, the Hamming distance between the first correlithm object 104 and the second correlithm object 104 can be computed as follows:

 

1001011011
1000011011
----------
0001000000 → Hamming distance = 1

In this example, the Hamming distance is equal to one because only one bit differs between the first correlithm object 104 and the second correlithm object 104. As another example, a third correlithm object 104 is represented by a third 10-bit string (0110100100). In this example, the Hamming distance between the first correlithm object 104 and the third correlithm object 104 can be computed as follows:

 

1001011011
0110100100
----------
1111111111 → Hamming distance = 10

The Hamming distance is equal to ten because all of the bits are different between the first correlithm object 104 and the third correlithm object 104. In the previous example, a Hamming distance equal to one indicates that the first correlithm object 104 and the second correlithm object 104 are close to each other in the n-dimensional space 102, which means they are similar to each other. In the second example, a Hamming distance equal to ten indicates that the first correlithm object 104 and the third correlithm object 104 are further from each other in the n-dimensional space 102 and are less similar to each other than the first correlithm object 104 and the second correlithm object 104. In other words, the similarity between a pair of correlithm objects can be readily determined based on the distance between the pair of correlithm objects.

 

As another example, the distance between a pair of correlithm objects 104 can be determined by performing an XOR operation between the pair of correlithm objects 104 and counting the number of logical high values in the binary string. The number of logical high values indicates the number of bits that are different between the pair of correlithm objects 104, which also corresponds with the Hamming distance between the pair of correlithm objects 104.
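
A sketch of this XOR-based variant is shown below; it assumes the categorical binary strings are first interpreted as integers, and counting the logical high bits of the XOR result yields the same Hamming distance as before. The example strings reuse the 10-bit values from the worked example above.

def hamming_distance_xor(co_a, co_b):
    """Hamming distance via XOR: count the logical high bits in (a XOR b)."""
    diff = int(co_a, 2) ^ int(co_b, 2)   # XOR the two categorical binary strings
    return bin(diff).count('1')          # number of bits that differ

print(hamming_distance_xor('1001011011', '1000011011'))  # -> 1
print(hamming_distance_xor('1001011011', '0110100100'))  # -> 10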

 

In another embodiment, the distance 106 between two correlithm objects 104 can be determined using a Minkowski distance such as the Euclidean or “straight-line” distance between the correlithm objects 104. For example, the distance 106 between a pair of correlithm objects 104 may be determined by calculating the square root of the sum of squares of the coordinate difference in each dimension.
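
A minimal sketch of this Euclidean alternative, assuming correlithm objects 104 are expressed as numeric coordinate vectors rather than bit strings, is:

import math

def euclidean_distance(point_a, point_b):
    """Straight-line distance 106: square root of the sum of squared coordinate differences."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(point_a, point_b)))

print(euclidean_distance((1, 0, 1), (0, 0, 1)))  # -> 1.0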

 

The user device 112 is configured to implement or emulate a correlithm object processing system that comprises one or more sensors 108, nodes 304, and/or actors 110 in order to convert data samples between real world values or representations and correlithm objects 104 in a correlithm object domain. Sensors 108 are generally configured to convert real world data samples to the correlithm object domain. Nodes 304 are generally configured to process or perform various operations on correlithm objects in the correlithm object domain. Actors 110 are generally configured to convert correlithm objects 104 into real world values or representations. Additional information about sensors 108, nodes 304, and actors 110 is described in Figure 3.

 

Performing operations using correlithm objects 104 in a correlithm object domain allows the user device 112 to identify relationships between data samples that cannot be identified using conventional data processing systems. For example, in the correlithm object domain, the user device 112 is able to identify not only data samples that exactly match an input data sample, but also other data samples that have similar characteristics or features as the input data samples. Conventional computers are unable to identify these types of relationships readily. Using correlithm objects 104 improves the operation of the user device 112 by enabling the user device 112 to efficiently process data samples and identify relationships between data samples without relying on signal processing techniques that require a significant amount of processing resources. These benefits allow the user device 112 to operate more efficiently than conventional computers by reducing the amount of processing power and resources that are needed to perform various operations.

Brief Description:

Figure 2 is a perspective view of an embodiment of a mapping between correlithm objects in different n-dimensional spaces

Detailed Description:

Figure 2 is a schematic view of an embodiment of a mapping between correlithm objects 104 in different n-dimensional spaces 102. When implementing a correlithm object processing system, the user device 112 performs operations within the correlithm object domain using correlithm objects 104 in different n-dimensional spaces 102. As an example, the user device 112 may convert different types of data samples having real world values into correlithm objects 104 in different n-dimensional spaces 102. For instance, the user device 112 may convert data samples of text into a first set of correlithm objects 104 in a first n-dimensional space 102 and data samples of audio into a second set of correlithm objects 104 in a second n-dimensional space 102. Conventional systems require data samples to be of the same type and/or format in order to perform any kind of operation on the data samples. In some instances, some types of data samples cannot be compared because there is no common format available. For example, conventional computers are unable to compare data samples of images and data samples of audio samples because there is no common format. In contrast, the user device 112 implementing a correlithm object processing system is able to compare and perform operations using correlithm objects 104 in the correlithm object domain regardless of the type or format of the original data samples.

 

In Figure 2, a first set of correlithm objects 204 are defined within a first n-dimensional space 212 and a second set of correlithm objects 208 are defined within a second n-dimensional space 210. The n-dimensional spaces may have the same number of dimensions or a different number of dimensions. For example, the first n-dimensional space 212 and the second n-dimensional space 210 may both be three dimensional spaces. As another example, the first n-dimensional space 212 may be a three dimensional space and the second n-dimensional space 210 may be a nine dimensional space. Correlithm objects 104 in the first n-dimensional space 212 and second n-dimensional space 210 are mapped to each other. In other words, a correlithm object 204 in the first n-dimensional space 212 may reference or be linked with a particular correlithm object 208 in the second n-dimensional space 210. The correlithm objects 104 may also be linked with and referenced by other correlithm objects 104 in other n-dimensional spaces 102. 

 

In one embodiment, a data structure such as table 200 may be used to map or link correlithm objects 104 in different n-dimensional spaces 102. In some instances, table 200 is referred to as a node table. Table 200 is generally configured to identify a first plurality of correlithm objects 104 in a first n-dimensional space 102 and a second plurality of correlithm objects 104 in a second n-dimensional space 102. Each correlithm object 104 in the first n-dimensional space 102 is linked with a correlithm object 104 in the second n-dimensional space 102. For example, table 200 may be configured with a first column 202 that lists correlithm objects 204 as source correlithm objects and a second column 206 that lists corresponding correlithm objects 208 as target correlithm objects. In other examples, table 200 may be configured in any other suitable manner or may be implemented using any other suitable data structure. In some embodiments, one or more mapping functions may be used to convert between a correlithm object 104 in a first n-dimensional space and a correlithm object 104 in a second n-dimensional space.
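
One way to emulate table 200 in software is sketched below; the dictionary layout and the table entries are illustrative assumptions rather than part of the disclosure, but they follow the two-column structure described above, linking source correlithm objects in a first n-dimensional space to target correlithm objects in a second space of a different dimension.

# Hypothetical node table 200: source correlithm objects (10-dimensional space)
# linked to target correlithm objects (6-dimensional space); entries are invented.
node_table = {
    '1001011011': '110010',   # source correlithm object 204 -> target correlithm object 208
    '0110100100': '001101',
}

def lookup_target(source_correlithm_object):
    """Return the target correlithm object linked with the given source correlithm object."""
    return node_table[source_correlithm_object]

print(lookup_target('1001011011'))  # -> '110010'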

Brief Description:

Figure 3 is a schematic view of an embodiment of a correlithm object processing system;

Detailed Description:

Figure 3 is a schematic view of an embodiment of a correlithm object processing system 300 that is implemented by a user device 112 to perform operations using correlithm objects 104. The system 300 generally comprises a sensor 108, a node 304, and an actor 110. The system 300 may be configured with any suitable number and/or configuration of sensors 108, nodes 304, and actors 110. An example of the system 300 in operation is described in Figure 4. In one embodiment, a sensor 108, a node 304, and an actor 110 may all be implemented on the same device (e.g. user device 112). In other embodiments, a sensor 108, a node 304, and an actor 110 may each be implemented on different devices in signal communication with each other, for example over a network. In other embodiments, different devices may be configured to implement any combination of sensors 108, nodes 304, and actors 110. 

 

Sensors 108 serve as interfaces that allow a user device 112 to convert real world data samples into correlithm objects 104 that can be used in the correlithm object domain. Sensors 108 enable the user device 112 to compare and perform operations using correlithm objects 104 regardless of the data type or format of the original data sample. Sensors 108 are configured to receive a real world value 320 representing a data sample as an input, to determine a correlithm object 104 based on the real world value 320, and to output the correlithm object 104. For example, the sensor 108 may receive an image 324 of a person and output a correlithm object 104 to the node 304 or actor 110. In one embodiment, sensors 108 are configured to use sensor tables 308 that link a plurality of real world values with a plurality of correlithm objects 104 in an n-dimensional space 102. Real world values are any type of signal, value, or representation of data samples. Examples of real world values include, but are not limited to, images, pixel values, text, audio signals, electrical signals, and biometric signals. As an example, a sensor table 308 may be configured with a first column 312 that lists real world value entries corresponding with different images and a second column 314 that lists corresponding correlithm objects 104 as input correlithm objects. In other examples, sensor tables 308 may be configured in any other suitable manner or may be implemented using any other suitable data structure. In some embodiments, one or more mapping functions may be used to translate between a real world value 320 and a correlithm object 104 in an n-dimensional space 102. Additional information for implementing or emulating a sensor 108 in hardware is described in Figure 5.
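
A sketch of a sensor table 308 lookup is shown below; the table contents and file names are invented for illustration, and only exact matching of real world value entries is shown.

# Hypothetical sensor table 308 with a column of real world value entries (312)
# and a column of input correlithm objects (314); contents are invented.
sensor_table = [
    ('image_alice.png', '1001011011'),
    ('image_bob.png',   '0110100100'),
]

def sense(real_world_value):
    """Return the input correlithm object linked with a matching real world value entry."""
    for entry, input_correlithm_object in sensor_table:
        if entry == real_world_value:
            return input_correlithm_object
    raise KeyError('no matching real world value entry in the sensor table')

print(sense('image_bob.png'))  # -> '0110100100'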

 

Nodes 304 are configured to receive a correlithm object 104 (e.g. an input correlithm object 314), to determine another correlithm object 104 based on the received correlithm object 104, and to output the identified correlithm object 104 (e.g. an output correlithm object 316). In one embodiment, nodes 304 are configured to use node tables 200 that link a plurality of correlithm objects 104 from a first n-dimensional space 102 with a plurality of correlithm objects 104 in a second n-dimensional space 102. A node table 200 may be configured similar to the table 200 described in Figure 2. Additional information for implementing or emulating a node 304 in hardware is described in Figure 5.

 

Actors 110 serve as interfaces that allow a user device 112 to convert correlithm objects 104 in the correlithm object domain back to real world values or data samples. Actors 110 enable the user device 112 to convert from correlithm objects 104 into any suitable type of real world value. Actors 110 are configured to receive a correlithm object 104 (e.g. an output correlithm object 316), to determine a real world output value 322 based on the received correlithm object 104, and to output the real world output value 322. The real world output value 322 may be a different data type or representation of the original data sample. As an example, the real world input value 320 may be an image 324 of a person and the resulting real world output value 322 may be text 326 and/or an audio signal identifying the person. In one embodiment, actors 110 are configured to use actor tables 310 that link a plurality of correlithm objects 104 in an n-dimensional space 102 with a plurality of real world values. As an example, an actor table 310 may be configured with a first column 316 that lists correlithm objects 104 as output correlithm objects and a second column 318 that lists real world values. In other examples, actor tables 310 may be configured in any other suitable manner or may be implemented using any other suitable data structure. In some embodiments, one or more mapping functions may be employed to translate between a correlithm object 104 in an n-dimensional space and a real world output value 322. Additional information for implementing or emulating an actor 110 in hardware is described in Figure 5.
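
A corresponding sketch of an actor table 310 lookup, again with invented contents, maps an output correlithm object back to a real world output value:

# Hypothetical actor table 310 with a column of output correlithm objects (316)
# and a column of real world values (318); contents are invented.
actor_table = {
    '110010': 'Alice',   # real world output value, e.g. text naming the person
    '001101': 'Bob',
}

def act(output_correlithm_object):
    """Return the real world output value linked with the given output correlithm object."""
    return actor_table[output_correlithm_object]

print(act('110010'))  # -> 'Alice'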

 

A correlithm object processing system 300 uses a combination of a sensor table 308, a node table 200, and/or an actor table 310 to provide a specific set of rules that improve computer-related technologies by enabling devices to compare and to determine the degree of similarity between different data samples regardless of the data type and/or format of the data samples they represent. The ability to directly compare data samples having different data types and/or formatting is a new functionality that cannot be performed using conventional computing systems and data structures. Conventional systems require data samples to be of the same type and/or format in order to perform any kind of operation on the data samples. In some instances, some types of data samples are incompatible with each other and cannot be compared because there is no common format available. For example, conventional computers are unable to compare data samples of images with data samples of audio samples because there is no common format available. In contrast, a device implementing a correlithm object processing system uses a combination of a sensor table 308, a node table 200, and/or an actor table 310 to compare and perform operations using correlithm objects 104 in the correlithm object domain regardless of the type or format of the original data samples. The correlithm object processing system 300 uses a combination of a sensor table 308, a node table 200, and/or an actor table 310 as a specific set of rules that provides a particular solution for dealing with different types of data samples and allows devices to perform operations on different types of data samples using correlithm objects 104 in the correlithm object domain. In some instances, comparing data samples as correlithm objects 104 is computationally more efficient and faster than comparing data samples in their original format. Thus, using correlithm objects 104 to represent data samples provides increased flexibility and improved performance compared to using other conventional data structures. The specific set of rules used by the correlithm object processing system 300 goes beyond simply using routine and conventional activities in order to achieve this new functionality and these performance improvements.

 

In addition, correlithm object processing system 300 uses a combination of a sensor table 308, a node table 200, and/or an actor table 310 to provide a particular manner for transforming data samples between ordinal number representations and correlithm objects 104 in a correlithm object domain. For example, the correlithm object processing system 300 may be configured to transform a representation of a data sample into a correlithm object 104, to perform various operations using the correlithm object 104 in the correlithm object domain, and to transform a resulting correlithm object 104 into another representation of a data sample. Transforming data samples between ordinal number representations and correlithm objects 104 involves fundamentally changing the data type of data samples between an ordinal number system and a categorical number system to achieve the previously described benefits of the correlithm object processing system 300. 

Brief Description:

Figure 4 is a protocol diagram of an embodiment of a correlithm object process flow

Detailed Description:

Figure 4 is a protocol diagram of an embodiment of a correlithm object process flow 400. A user device 112 implements process flow 400 to emulate a correlithm object processing system 300 to perform operations using correlithm objects 104, such as facial recognition. The user device 112 implements process flow 400 to compare different data samples (e.g. images, voice signals, or text) to each other and to identify other objects based on the comparison. Process flow 400 provides instructions that allow user devices 112 to achieve the improved technical benefits of a correlithm object processing system 300. 

 

Conventional systems are configured to use ordinal numbers for identifying different data samples. Ordinal based number systems only provide information about the sequence order of numbers based on their numeric values, and do not provide any information about any other types of relationships for the data samples being represented by the numeric values, such as similarity. In contrast, a user device 112 can implement or emulate the correlithm object processing system 300, which provides an unconventional solution that uses categorical numbers and correlithm objects 104 to represent data samples. For example, the system 300 may be configured to use binary integers as categorical numbers to generate correlithm objects 104, which enables the user device 112 to perform operations directly based on similarities between different data samples. Categorical numbers provide information about how similar different data samples are to each other. Correlithm objects 104 generated using categorical numbers can be used directly by the system 300 for determining how similar different data samples are to each other without relying on exact matches, a common data type or format, or conventional signal processing techniques.

 

A non-limiting example is provided to illustrate how the user device 112 implements process flow 400 to emulate a correlithm object processing system 300 to perform facial recognition on an image to determine the identity of the person in the image. In other examples, the user device 112 may implement process flow 400 to emulate a correlithm object processing system 300 to perform voice recognition, text recognition, or any other operation that compares different objects.

 

At step 402, a sensor 108 receives an input signal representing a data sample. For example, the sensor 108 receives an image of a person’s face as a real world input value 320. The input signal may be in any suitable data type or format. In one embodiment, the sensor 108 may obtain the input signal in real-time from a peripheral device (e.g. a camera). In another embodiment, the sensor 108 may obtain the input signal from a memory or database.

 

At step 404, the sensor 108 identifies a real world value entry in a sensor table 308 based on the input signal. In one embodiment, the system 300 identifies a real world value entry in the sensor table 308 that matches the input signal. For example, the real world value entries may comprise previously stored images. The sensor 108 may compare the received image to the previously stored images to identify a real world value entry that matches the received image. In one embodiment, when the sensor 108 does not find an exact match, the sensor 108 finds the real world value entry that most closely matches the received image.

 

At step 406, the sensor 108 identifies and fetches an input correlithm object 314 in the sensor table 308 linked with the real world value entry. At step 408, the sensor 108 sends the identified input correlithm object 314 to the node 304. In one embodiment, the identified input correlithm object 314 is represented in the sensor table 308 using a categorical binary integer string. The sensor 108 sends the binary string representing the identified input correlithm object 314 to the node 304.

 

At step 410, the node 304 receives the input correlithm object 314 and determines distances 106 between the input correlithm object 314 and each source correlithm object 104 in a node table 200. In one embodiment, the distance 106 between two correlithm objects 104 can be determined based on the differences between the bits of the two correlithm objects 104. In other words, the distance 106 between two correlithm objects can be determined based on how many individual bits differ between a pair of correlithm objects 104. The distance 106 between two correlithm objects 104 can be computed using Hamming distance or any other suitable technique. In another embodiment, the distance 106 between two correlithm objects 104 can be determined using a Minkowski distance such as the Euclidean or “straight-line” distance between the correlithm objects 104. For example, the distance 106 between a pair of correlithm objects 104 may be determined by calculating the square root of the sum of squares of the coordinate difference in each dimension.

 

At step 412, the node 304 identifies a source correlithm object 104 from the node table 200 with the shortest distance 106. A source correlithm object 104 with the shortest distance from the input correlithm object 314 is a correlithm object 104 that either matches or most closely matches the received input correlithm object 314. 
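
Steps 410 and 412 can be sketched as a nearest-neighbor search over the node table 200 using Hamming distance; the helper below and its table entries are illustrative assumptions, not the disclosed implementation.

def hamming_distance(co_a, co_b):
    """Distance 106: count of differing bits between two equal-length bit strings."""
    return sum(a != b for a, b in zip(co_a, co_b))

def nearest_source(input_correlithm_object, node_table):
    """Steps 410-412: return the source correlithm object with the shortest distance 106."""
    return min(node_table, key=lambda source: hamming_distance(source, input_correlithm_object))

# Hypothetical node table 200 keyed by source correlithm objects.
node_table = {'1001011011': '1100101100', '0110100100': '0011010011'}
print(nearest_source('1001011010', node_table))  # -> '1001011011' (distance 1)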

 

At step 414, the node 304 identifies and fetches a target correlithm object 206 in the node table 200 linked with the source correlithm object 104. At step 416, the node 304 outputs the identified target correlithm object 206 to the actor 110. In this example, the identified target correlithm object 206 is represented in the node table 200 using a categorical binary integer string. The node 304 sends the binary string representing the identified target correlithm object 206 to the actor 110. 

 

At step 418, the actor 110 receives the target correlithm object 206 and determines distances between the target correlithm object 206 and each output correlithm object 316 in an actor table 310. The actor 110 may compute the distances between the target correlithm object 206 and each output correlithm object 316 in an actor table 310 using a process similar to the process described in step 410. 

 

At step 420, the actor 110 identifies an output correlithm object 316 from the actor table 310 with the shortest distance 106. An output correlithm object 316 with the shortest distance from the target correlithm object 206 is a correlithm object 316 that either matches or most closely matches the received target correlithm object 206. 

 

At step 422, the actor 110 identifies and fetches a real world output value in the actor table 310 linked with the output correlithm object 316. The real world output value may be any suitable type of data sample that corresponds with the original input signal. For example, the real world output value may be text that indicates the name of the person in the image or some other identifier associated with the person in the image. As another example, the real world output value may be an audio signal or sample of the name of the person in the image. In other examples, the real world output value may be any other suitable real world signal or value that corresponds with the original input signal. The real world output value may be in any suitable data type or format.

 

At step 424, the actor 110 outputs the identified real world output value. In one embodiment, the actor 110 may output the real world output value in real-time to a peripheral device (e.g. a display or a speaker). In one embodiment, the actor 110 may output the real world output value to a memory or database. In one embodiment, the real world output value is sent to another sensor 108. For example, the real world output value may be sent to another sensor 108 as an input for another process.
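
Putting the stages together, the following self-contained sketch emulates process flow 400 from sensor 108 to node 304 to actor 110. All table contents, names, and bit widths are invented for illustration and are not part of the original disclosure.

def hamming(a, b):
    """Distance 106 between two correlithm objects held as equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical sensor table 308, node table 200, and actor table 310.
sensor_table = {'image_alice.png': '1001011011', 'image_bob.png': '0110100100'}
node_table   = {'1001011011': '110010', '0110100100': '001101'}
actor_table  = {'110010': 'Alice', '001101': 'Bob'}

def process_flow_400(real_world_input_value):
    """Emulate the sensor, node, and actor stages of process flow 400 (steps 402-424)."""
    # Steps 402-408: the sensor converts the input to an input correlithm object.
    input_co = sensor_table[real_world_input_value]
    # Steps 410-416: the node finds the closest source correlithm object and
    # fetches the linked target correlithm object.
    source = min(node_table, key=lambda s: hamming(s, input_co))
    target_co = node_table[source]
    # Steps 418-424: the actor finds the closest output correlithm object and
    # returns the linked real world output value.
    output = min(actor_table, key=lambda o: hamming(o, target_co))
    return actor_table[output]

print(process_flow_400('image_alice.png'))  # -> 'Alice'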

Brief Description:

Figure 5 is a schematic diagram of an embodiment of a computer architecture for emulating a correlithm object processing system;

Detailed Description:

Figure 5 is a schematic diagram of an embodiment of a computer architecture 500 for emulating a correlithm object processing system 300 in a user device 112. The computer architecture 500 comprises a processor 502, a memory 504, a network interface 506, and an input-output (I/O) interface 508. The computer architecture 500 may be configured as shown or in any other suitable configuration.

 

The processor 502 comprises one or more processors operably coupled to the memory 504. The processor 502 is any electronic circuitry including, but not limited to, state machines, one or more central processing unit (CPU) chips, logic units, cores (e.g. a multi-core processor), field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), graphics processing units (GPUs), or digital signal processors (DSPs). The processor 502 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination of the preceding. The processor 502 is communicatively coupled to and in signal communication with the memory 504. The one or more processors are configured to process data and may be implemented in hardware or software. For example, the processor 502 may be 8-bit, 16-bit, 32-bit, 64-bit or of any other suitable architecture. The processor 502 may include an arithmetic logic unit (ALU) for performing arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from memory and executes them by directing the coordinated operations of the ALU, registers and other components.

 

The one or more processors are configured to implement various instructions. For example, the one or more processors are configured to execute instructions to implement sensor engines 510, delay node engines 528, node engines 512, boss engines 530, and actor engines 514. In an embodiment, the sensor engines 510, the node engines 512, and the actor engines 514 are implemented using logic units, FPGAs, ASICs, DSPs, or any other suitable hardware.

 

In one embodiment, the sensor engine 510 is configured to receive a real world value 320 as an input, to determine a correlithm object 206 based on the real world value 320, and to output the correlithm object 206. Examples of the sensor engine 510 in operation are described in Figure 4.

 

In one embodiment, the node engine 512 is configured to receive a correlithm object 206 (e.g. an input correlithm object 206), to determine another correlithm object 206 based on the received correlithm object 206, and to output the identified correlithm object 206 (e.g. an output correlithm object 316). The node engine 512 is also configured to compute distances between pairs of correlithm objects 206. 

 

In one embodiment, the delay node engine 528 is configured to receive a correlithm object 206 and then output the correlithm object 206 after a predetermined amount of time has elapsed. In other words, the delay node engine 528 is configured to provide delays or delay lines for a correlithm object processing system. Examples of the delay node engine 528 in operation are described in FIGS. 6-11. 

 

In one embodiment, the boss engine 530 is configured to control and synchronize components within a correlithm object processing system. The boss engine 530 is configured to send commands (e.g. execute commands or output commands) to components within a correlithm object processing system to control their operation. Examples of the boss engine 530 in operation are described in FIGS. 14-17. 

 

In one embodiment, the actor engine 514 is configured to receive a correlithm object 206 (e.g. an output correlithm object 316), to determine a real world output value 322 based on the received correlithm object 206, and to output the real world output value 322. Examples of the actor engine 514 in operation are described in Figure 4.

 

The memory 504 comprises one or more non-transitory disks, tape drives, or solid-state drives, and may be used as an over-flow data storage device, to store programs when such programs are selected for execution, and to store instructions and data that are read during program execution. The memory 504 may be volatile or non-volatile and may comprise read-only memory (ROM), random-access memory (RAM), ternary content-addressable memory (TCAM), dynamic random-access memory (DRAM), and static random-access memory (SRAM). The memory 504 is operable to store sensor instructions 516, node instructions 518, delay node instructions 522, boss instructions 524, actor instructions 520, sensor tables 308, node tables 200, actor tables 310, boss tables 526, and/or any other data or instructions. The sensor instructions 516, the node instructions 518, the delay node instructions 522, the boss instructions 524, and the actor instructions 520 comprise any suitable set of instructions, logic, rules, or code operable to execute the sensor engine 510, the node engine 512, the delay node engine 528, the boss engine 530, and the actor engine 514, respectively. 

 

The sensor tables 308, the node tables 200, and the actor tables 310 may be configured similar to the sensor tables 308, the node tables 200, and the actor tables 310 described in Figure 3, respectively. The boss table 526 generally comprises a list of components within a correlithm object processing system. Additional information about boss tables 526 is described in FIGS. 14-17. 

 

The network interface 506 is configured to enable wired and/or wireless communications. The network interface 506 is configured to communicate data with any other device or system. For example, the network interface 506 may be configured for communication with a modem, a switch, a router, a bridge, a server, or a client. The processor 502 is configured to send and receive data using the network interface 506.

 

The I/O interface 508 may comprise ports, transmitters, receivers, transceivers, or any other devices for transmitting and/or receiving data with peripheral devices as would be appreciated by one of ordinary skill in the art upon viewing this disclosure. For example, the I/O interface 508 may be configured to communicate data between the processor 502 and peripheral hardware such as a graphical user interface, a display, a mouse, a keyboard, a key pad, and a touch sensor (e.g. a touch screen). 


Parts List

102

n-dimensional space

104

correlithm objects

106

distance

108

sensor

110

actor

112

user device

114

user device

200

node table

202

source correlithm objects

204

correlithm objects

206

target correlithm object

208

correlithm object

210

n-dimensional space

212

n-dimensional space

300

correlithm object processing system

302

sensor

304

node

306

actor

308

sensor table

310

actor tables

312

real world value

314

input correlithm object

316

output correlithm objects

318

real world values

320

real world value

322

real world output value

324

image

326

text

400

correlithm object process flow

402

block

404

block

406

block

408

block

410

block

412

block

414

block

416

block

418

block

420

block

422

block

424

block

500

computer architecture

502

processor

504

memory

506

network interface

508

I/O interface

510

sensor engine

512

node engines

514

actor engines

516

sensor instructions

518

node instructions

520

actor instructions

522

delay node instructions

524

boss instructions

526

boss table

528

delay node engine

530

boss engine

