Organization: Public

Battery Pack Arrangement


Drawings

Brief Description:

Figure 1 is a diagram of a possible battery pack arrangement composed of multiple cells, monitored and controlled by a Battery Energy Control Module.

Detailed Description:

A traction battery pack(s) 102 may be constructed from a variety of chemical formulations. Typical battery pack chemistries are lead acid, nickel-metal hydride (NiMH), or lithium-ion. Figure 1 shows a typical traction battery pack(s) 102 in a simple series configuration of N battery cell(s) 104. Other traction battery pack(s) 102, however, may be composed of any number of individual battery cells connected in series or parallel or some combination thereof. A typical system may have one or more controllers, such as a Battery Energy Control Module (BECM 108), that monitor and control the performance of the traction battery pack(s) 102. The BECM 108 may monitor several battery pack level characteristics such as pack current 112, pack voltage 114, and pack temperature 110. The BECM 108 may have non-volatile memory such that data may be retained when the BECM 108 is in an off condition. Retained data may be available upon the next key cycle.

In addition to the pack level characteristics, there may be battery cell(s) 104 level characteristics that are measured and monitored. For example, the terminal voltage, current, and temperature of each battery cell(s) 104 may be measured. A system may use a sensor module(s) 106 to measure the battery cell(s) 104 characteristics. Depending on its capabilities, the sensor module(s) 106 may measure the characteristics of one or multiple of the battery cell(s) 104. The traction battery pack(s) 102 may utilize up to N_c sensor module(s) 106 to measure the characteristics of all the battery cell(s) 104. Each sensor module(s) 106 may transfer the measurements to the BECM 108 for further processing and coordination. The sensor module(s) 106 may transfer signals in analog or digital form to the BECM 108. In some embodiments, the sensor module(s) 106 functionality may be incorporated internally to the BECM 108. That is, the sensor module(s) 106 hardware may be integrated as part of the circuitry in the BECM 108, and the BECM 108 may handle the processing of raw signals.

It may be useful to calculate various characteristics of the battery pack. Quantities such as battery power capability and battery state of charge may be useful for controlling the operation of the battery pack as well as any electrical loads receiving power from the battery pack. Battery power capability is a measure of the maximum amount of power the battery can provide, or the maximum amount of power the battery can receive, over the next specified time period, for example one second or less. Knowing the battery power capability allows electrical loads to be managed such that the power requested is within limits that the battery can handle.
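By way of illustration only, the following Python sketch estimates a maximum discharge power for the next short time period from an assumed open-circuit voltage, internal resistance, and minimum allowed terminal voltage; the resistive model and all numbers are assumptions, not values from this disclosure.

def discharge_power_capability(v_oc: float, r_int: float, v_min: float) -> float:
    """Rough maximum discharge power for the next short time period, assuming a
    simple resistive model: the largest current that keeps the terminal voltage
    at or above v_min, multiplied by v_min. All inputs are assumed known."""
    i_max = (v_oc - v_min) / r_int
    return v_min * i_max

# Illustrative cell-level numbers only (hypothetical): 3.7 V open-circuit,
# 10 mOhm internal resistance, 2.8 V minimum terminal voltage.
print(discharge_power_capability(v_oc=3.7, r_int=0.01, v_min=2.8))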

Battery pack state of charge (SOC) gives an indication of how much charge remains in the battery pack. The battery pack SOC may be output to inform the driver of how much charge remains in the battery pack, similar to a fuel gauge. The battery pack SOC may also be used to control the operation of an electric or hybrid-electric vehicle. Calculation of battery pack or cell SOC can be accomplished by a variety of methods. One possible method of calculating battery SOC is to perform an integration of the battery pack current over time. One possible disadvantage to this method is that the current measurement may be noisy, and inaccuracy in the state of charge may accumulate as this noisy signal is integrated over time. Calculation of battery pack or cell SOC can also be accomplished by using an observer, wherein a battery model is used for construction of the observer, with measurements of battery current, terminal voltage, and temperature. Battery model parameters may be identified through recursive estimation based on such measurements.
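As a non-authoritative illustration of the current-integration method mentioned above, the following Python sketch performs one coulomb-counting update per sample; the capacity, sign convention, and efficiency term are assumptions rather than values from this disclosure.

def update_soc(soc: float, current_a: float, dt_s: float,
               capacity_ah: float, efficiency: float = 1.0) -> float:
    """One step of current integration (coulomb counting).
    Positive current is taken here as discharge; soc is a fraction in [0, 1].
    Any noise or bias in current_a accumulates in soc over time, which is the
    drawback noted above."""
    delta = efficiency * current_a * dt_s / (capacity_ah * 3600.0)
    return min(1.0, max(0.0, soc - delta))

# Illustrative use: a hypothetical 75 Ah pack, 10 A discharge sampled every 0.1 s.
soc = 0.80
for _ in range(600):   # one minute of samples
    soc = update_soc(soc, current_a=10.0, dt_s=0.1, capacity_ah=75.0)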

The accuracy of voltage and current sensor measurements depends on many factors. Noise may impact the signal that is measured. For example, the accuracy of a Hall-effect type current sensor may depend on shielding the sensor and conductors from environmental magnetic fields. Biases in the sensor measurements may also be present. Prior art systems may utilize current measurements taken prior to contactor closing to calculate a current measurement bias. Before the contactor closes, there should be no current flowing.

A battery management system may estimate various battery parameters based on the sensor measurements. Current and voltage sensor biases and inaccuracies may be time-varying in nature. Therefore, pre-contactor close compensation may not be accurate enough over the entire operating time of the sensors. The short sample time before the contactor is closed only allows limited sampling of the current sensor. The pre-contactor close samples may not be accurate due to the rise time of the current sensor from BECM start-up. Another significant issue may be the lack of exact synchronization in voltage and current measurements. Battery parameter identification depends on well-defined inputs (current) and outputs (terminal voltage). A loss of synchronization between the signals may result in measured data that does not accurately represent the real battery behavior, which may lead to erroneous parameter estimation.
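The recursive estimation mentioned above could, under strong simplifying assumptions, look like the following recursive least squares sketch in Python for a purely resistive battery model v = OCV - R0*i; the model structure, forgetting factor, and sample values are illustrative assumptions and not the method of this disclosure. Note that each (i, v) pair must come from synchronized current and voltage measurements, which is exactly why the synchronization issue above matters.

import numpy as np

def rls_step(theta, P, phi, v_measured, lam=0.99):
    """One recursive-least-squares update for a simple battery model
    v = OCV - R0 * i, written as v = theta^T phi with theta = [OCV, R0]
    and phi = [1, -i]. lam is the forgetting factor."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)        # gain vector
    err = v_measured - (theta.T @ phi).item()    # prediction error
    theta = theta + k * err
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# Example: start from a rough guess and refine with synchronized (i, v) samples.
theta = np.array([[3.7], [0.05]])                # initial OCV [V] and R0 [ohm] guesses
P = np.eye(2) * 100.0
for i, v in [(1.0, 3.64), (2.0, 3.60), (0.5, 3.67)]:
    theta, P = rls_step(theta, P, np.array([1.0, -i]), v)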


Parts List

102

traction battery pack(s)

104

battery cell(s)

106

sensor module(s)

108

BECM

110

pack temperature

112

pack current

114

pack voltage


Terms/Definitions

circuitry

measure

much charge

sensor and conductors

pack current

system

measured data

exact synchronization

time

raw signals

current measurement bias

battery management system

pack temperature

construction

methods

current measurement

such measurements

signal

quantities

contactor closing

processing

next specified time period

loss

accuracy

battery model parameters

lead acid, nickel-metal hydride

voltage and current measurements

next key cycle

short sample time

data

integration

pack voltage

addition

hybrid-electric vehicle

individual battery cells

significant issue

method

retained data

terminal voltage

analog

sensor measurements

voltage sensor biases and inaccuracies

measurements

recursive estimation

erroneous parameter estimation

combination

digital form

battery pack SOC

limits

variety

real battery behavior

level characteristics

battery power capability and battery state

further processing and coordination

operation

chemical formulations

possible inaccuracy

contactor

limited sampling

sensor module(s)

indication

battery parameter identification

Battery Energy Control Module (BECM)

output

typical system

outputs

noise

fuel gauge

current sensor measurement

cell

off condition

battery pack or cell SOC

characteristics

various battery parameters

calculation

state

signals

synchronization

lack

Lithium-Ion

observer

current measurements

maximum amount

functionality

battery SOC

battery model

many factors

n battery cells

traction battery pack(s)

battery

part

entire operating time

performance

pre-contactor close compensation

noisy signal

current sensor

electrical loads

simple series configuration

battery cell(s)

non-volatile memory

battery pack current

embodiments

capabilities

series

nature

various characteristics

hall-effect type current sensor

power

typical battery pack chemistries

battery power capability

driver

temperature

NIMH

traction battery pack

pack level characteristics

biases

other battery packs

environmental magnetic fields

all the battery cells

pre-contactor close samples

battery current, terminal voltage

charge

current flowing

start-up

hardware

battery pack

BECM

sensors

example

rise time

prior art systems

number

battery pack state

well-defined inputs

Augmented Reality


Drawings

Brief Description:

illustrates an embodiment of a superimposing logic 102.

Detailed Description:

Figure 1 illustrates an embodiment of an augmented reality environment 100. A user 110 wearing headset 114 interacts with physical objects virtualized in the augmented reality environment 100. In this example the user 110 interacts with either a purely virtual document, or a physical document that is virtualized as a virtual document 112 on a virtual surface 104 in the augmented reality environment 100. In this embodiment, an imaging sensor 108 is directed toward a physical surface 106, and superimposing logic 102 receives a sensor output 116 (e.g., image or video) from the imaging sensor 108.  Superimposing logic 102 transforms the sensor output 116 into a virtual document 112 superimposed on a virtual surface 104 representing the physical surface 106 in the augmented reality environment 100.
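As a hedged illustration of one step such superimposing logic 102 might perform, the sketch below (Python with OpenCV, both assumed available) rectifies a detected physical document into a flat texture that could be applied to the virtual surface 104; the corner-detection step and all parameter values are assumptions, not details of this disclosure.

import numpy as np
import cv2

def rectify_document(frame: np.ndarray, corners: np.ndarray,
                     out_w: int = 850, out_h: int = 1100) -> np.ndarray:
    """Warp the document region of a camera frame into an upright texture.
    corners: 4x2 array of pixel coordinates ordered top-left, top-right,
    bottom-right, bottom-left (assumed detected elsewhere)."""
    dst = np.array([[0, 0], [out_w - 1, 0],
                    [out_w - 1, out_h - 1], [0, out_h - 1]], dtype=np.float32)
    H = cv2.getPerspectiveTransform(corners.astype(np.float32), dst)
    return cv2.warpPerspective(frame, H, (out_w, out_h))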

In other embodiments there may be no physical surface 106 and no physical document on the physical surface 106, in which case the environment would be a purely virtual reality (VR) environment, not an augmented reality environment 100. Thus there are many possibilities for the environment – it could be purely virtual, or a physical surface 106 that is virtualized and augmented with a virtual document, or both the physical surface 106 and a physical document could be virtualized.

Brief Description:

illustrates an AR or VR system 200 in accordance with one embodiment.

Detailed Description:

Figure 2 illustrates an AR or VR system 200 in accordance with one embodiment. A virtual environment 202 receives input from the user 214 and in response sends an interaction signal to a virtual object 206, a virtual surface 210, or an application 212. The virtual object 206, virtual surface 210, or application 212 sends an action to an operating system 204, and in response the operating system 204 operates the hardware 208 to implement the action in the augmented or virtual environment.

Brief Description:

illustrates a device 300 in accordance with one embodiment.

Detailed Description:

Figure 3 illustrates a perspective view of a wearable augmented reality (“AR”) device ( device 300), from the perspective of a wearer of the device 300 (“AR user”). The device 300 is a computer device in the form of a wearable headset. 

The device 300 comprises a headpiece 302, which is a headband, arranged to be worn on the wearer’s head. The headpiece 302 has a central portion 304 intended to fit over the nose bridge of a wearer, and has an inner curvature intended to wrap around the wearer’s head above their ears.

The headpiece 302 supports a left optical component 306 and a right optical component 308, which are waveguides. For ease of reference herein an optical component will be considered to be either a left or right component, because in the described embodiment the components are essentially identical apart from being mirror images of each other. Therefore, all description pertaining to the left-hand component also pertains to the right-hand component. The device 300 comprises augmented reality device logic 400 that is depicted in Figure 4.

The augmented reality device logic 400 comprises a graphics engine 402, which may comprise a micro display and imaging optics in the form of a collimating lens (not shown). The micro display can be any type of image source, such as liquid crystal on silicon (LCOS) displays, transmissive liquid crystal displays (LCD), matrix arrays of LEDs (whether organic or inorganic), or any other suitable display. The display is driven by circuitry known in the art to activate individual pixels of the display to generate an image. Substantially collimated light from each pixel falls on an exit pupil of the graphics engine 402. At the exit pupil, the collimated light beams are coupled into the left optical component 306 and the right optical component 308 at a respective left in-coupling zone 310 and right in-coupling zone 312. In-coupled light is then guided, through a mechanism that involves diffraction and total internal reflection (TIR), laterally across the optical component in a respective left intermediate zone 314 and right intermediate zone 316, and also downward into a respective left exit zone 318 and right exit zone 320, where it exits towards the user's eye.

The collimating lens collimates the image into a plurality of beams, which form a virtual version of the displayed image, the virtual version being a virtual image at infinity in the optics sense. The light exits as a plurality of beams, corresponding to the input beams and forming substantially the same virtual image, which the lens of the eye projects onto the retina to form a real image visible to the user. In this manner, the left optical component 306 and the right optical component 308 project the displayed image onto the wearer’s eyes. 

The various optical zones can, for example, be suitably arranged diffraction gratings or holograms. Each optical component has a refractive index n which is such that total internal reflection takes place to guide the beam from the light engine along the respective intermediate expansion zone, and down towards the respective exit zone.

Each optical component is substantially transparent, whereby the wearer can see through it to view a real-world environment in which they are located simultaneously with the projected image, thereby providing an augmented reality experience.

To provide a stereoscopic image, i.e. one that is perceived as having 3D structure by the user, slightly different versions of a 2D image can be projected onto each eye, for example from multiple graphics engines 402 (i.e. two micro displays), or from the same light engine (i.e. one micro display) using suitable optics to split the light output from the single display.

The device 300 is just one exemplary configuration. For instance, where two light-engines are used, these may instead be at separate locations to the right and left of the device (near the wearer’s ears). Moreover, whilst in this example, the input beams that form the virtual image are generated by collimating light from the display, an alternative light engine based on so-called scanning can replicate this effect with a single beam, the orientation of which is fast modulated whilst simultaneously modulating its intensity and/or colour. A virtual image can be simulated in this manner that is equivalent to a virtual image that would be created by collimating light of a (real) image on a display with collimating optics. Alternatively, a similar AR experience can be provided by embedding substantially transparent pixels in a glass or polymer plate in front of the wearer’s eyes, having a similar configuration to the left optical component 306 and right optical component 308 though without the need for the zone structures.

Other headpiece 302 embodiments are also within the scope of the subject matter. For instance, the display optics can equally be attached to the user's head using a frame (in the manner of conventional spectacles), helmet, or other fit system. The purpose of the fit system is to support the display and provide stability to the display and other head-borne systems such as tracking systems and cameras. The fit system can be designed to fit the user population's anthropometric range and head morphology and to provide comfortable support of the display system.

The device 300 also comprises one or more cameras 404, for example a left stereo camera 322 and a right stereo camera 324 mounted on the headpiece 302 and configured to capture an approximate view ("field of view") from the user's left and right eyes respectively in this example. The cameras are located towards either side of the user's head on the headpiece 302, and thus capture images of the scene forward of the device from slightly different perspectives. In combination, the stereo cameras capture a stereoscopic moving image of the real-world environment as the device moves through it. A stereoscopic moving image means two moving images showing slightly different perspectives of the same scene, each formed of a temporal sequence of frames to be played out in quick succession to replicate movement. When combined, the two images give the impression of moving 3D structure.

A left microphone 326 and a right microphone 328 are located at the front of the headpiece (from the perspective of the wearer), and left and right channel speakers, earpiece or other audio output transducers are to the left and right of the headpiece 302. These are in the form of a pair of bone conduction audio transducers functioning as a left speaker 330 and right speaker 332 audio channel output.

Brief Description:

illustrates an augmented reality device logic 400 in accordance with one embodiment.

Detailed Description:

Figure 4 illustrates components of an exemplary augmented reality device logic 400. The augmented reality device logic 400 comprises a graphics engine 402, a camera 404, processing units 406 (including one or more CPU 408 and/or GPU 410), a WiFi 412 wireless interface, a Bluetooth 414 wireless interface, speakers 416, microphones 418, and one or more memory 420.

The processing units 406 may in some cases comprise programmable devices such as bespoke processing units optimized for a particular function, such as AR related functions. The augmented reality device logic 400 may comprise other components that are not shown, such as dedicated depth sensors, additional interfaces etc.

 

Some or all of the components in Figure 4 may be housed in an AR headset. In some embodiments, some of these components may be housed in a separate housing connected to, or in wireless communication with, the components of the AR headset. For example, a separate housing for some components may be designed to be worn on a belt or to fit in the wearer's pocket, or one or more of the components may be housed in a separate computer device (smartphone, tablet, laptop, or desktop computer, etc.) which communicates wirelessly with the display and camera apparatus in the AR headset, whereby the headset and separate device constitute the full augmented reality device logic 400.

The memory 420 comprises logic 422 to be executed by the processing units 406. In some cases, different parts of the logic 422 may be executed by different components of the processing units 406. The logic 422 typically comprises code of an operating system, as well as code of one or more applications configured to run on the operating system to carry out aspects of the processes disclosed herein.

Brief Description:

illustrates an AR device 500 that may implement aspects of the machine processes described herein.

Detailed Description:

Figure 5 illustrates more aspects of an AR device 500 according to one embodiment.  The AR device 500 comprises processing units 502, input devices 504, memory 506,  output devices 508, storage devices 510, a network interface 512, and various logic to carry out the processes disclosed herein.

The input devices 504 comprise transducers that convert physical phenomena into machine-internal signals, typically electrical, optical, or magnetic signals. Signals may also be wireless, in the form of electromagnetic radiation in the radio frequency (RF) range but also potentially in the infrared or optical range. Examples of input devices 504 are keyboards, which respond to touch or physical pressure from an object or proximity of an object to a surface; mice, which respond to motion through space or across a plane; microphones, which convert vibrations in the medium (typically air) into device signals; and scanners, which convert optical patterns on two- or three-dimensional objects into device signals. The signals from the input devices 504 are provided via various machine signal conductors (e.g., busses or network interfaces) and circuits to the memory 506.

The memory 506 provides for storage (via configuration of matter or states of matter) of signals received from the input devices 504, instructions and information for controlling operation of the processing units 502, and signals from storage devices 510. The memory 506 may in fact comprise multiple memory devices of different types, for example random access memory devices and non-volatile (e.g., FLASH memory) devices.

Information stored in the memory 506 is typically directly accessible to the processing units 502 of the device. Signals input to the AR device 500 cause the reconfiguration of the internal material/energy state of the memory 506, creating logic that in essence forms a new machine configuration, influencing the behavior of the AR device 500 by affecting the behavior of the processing units 502 with control signals (instructions) and data provided in conjunction with the control signals. 

The storage devices 510 may provide a slower but higher capacity machine memory capability. Examples of storage devices 510 are hard disks, optical disks, large capacity flash memories or other non-volatile memory technologies, and magnetic memories. 

The processing units 502 may cause the configuration of the memory 506 to be altered by signals in the storage devices 510. In other words, the processing units 502 may cause data and instructions to be read from storage devices 510 into the memory 506, from which they may then influence the operations of the processing units 502 as instructions and data signals, and from which they may also be provided to the output devices 508. The processing units 502 may alter the content of the memory 506 by signaling to a machine interface of the memory 506 to alter its internal configuration, and may then convert signals to the storage devices 510 to alter their material internal configuration. In other words, data and instructions may be backed up from memory 506, which is often volatile, to storage devices 510, which are often non-volatile.

Output devices 508 are transducers which convert signals received from the memory 506 into physical phenomenon such as vibrations in the air, or patterns of light on a machine display, or vibrations (i.e., haptic devices) or patterns of ink or other materials (i.e., printers and 3-D printers).  

The network interface 512 receives signals from the memory 506 or processing units 502  and converts them into electrical, optical, or wireless signals to other machines, typically via a machine network. The network interface 512 also receives signals from the machine network and converts them into electrical, optical, or wireless signals to the memory 506 or processing units 502.

Brief Description:

illustrates an AR device logic 600 in accordance with one embodiment.

Detailed Description:

Figure 6 illustrates a functional block diagram of an embodiment of AR device logic 600. The AR device logic 600 comprises the following functional modules: a rendering engine 616, local augmentation logic 614, local modeling logic 608, device tracking logic 606, an encoder 612, and a decoder 620. Each of these functional modules may be implemented in software, dedicated hardware, firmware, or a combination of these logic types.

The rendering engine 616 controls the graphics engine 618 to generate a stereoscopic image visible to the wearer, i.e. to generate slightly different images that are projected onto different eyes by the optical components of a headset substantially simultaneously, so as to create the impression of 3D structure.

The stereoscopic image is formed by rendering engine 616 rendering at least one virtual display element (“augmentation”), which is perceived as a 3D element, i.e. having perceived 3D structure, at a real-world location in 3D space by the user.

An augmentation is defined by an augmentation object stored in the memory 602. The augmentation object comprises: location data defining a desired location in 3D space for the virtual element (e.g. as (x,y,z) Cartesian coordinates); structural data defining 3D surface structure of the virtual element, i.e. a 3D model of the virtual element; and image data defining 2D surface texture of the virtual element to be applied to the surfaces defined by the 3D model. The augmentation object may comprise additional information, such as a desired orientation of the augmentation.
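A minimal data-structure sketch of such an augmentation object, assuming Python and a triangle-mesh representation for the structural data (field names are illustrative, not taken from this disclosure), might be:

from dataclasses import dataclass
import numpy as np

@dataclass
class AugmentationObject:
    # Desired location of the virtual element in 3D space, as (x, y, z).
    location: tuple[float, float, float]
    # Structural data: 3D surface structure as a mesh (N x 3 vertices, M x 3 triangle indices).
    vertices: np.ndarray
    triangles: np.ndarray
    # Image data: 2D surface texture to apply to the surfaces defined by the 3D model.
    texture: np.ndarray
    # Optional additional information, e.g. a desired orientation (pitch, roll, yaw).
    orientation: tuple[float, float, float] = (0.0, 0.0, 0.0)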

The perceived 3D effects are achieved through suitable rendering of the augmentation object. To give the impression of the augmentation having 3D structure, a stereoscopic image is generated based on the 2D surface and 3D augmentation model data in the data object, with the augmentation being rendered to appear at the desired location in the stereoscopic image.

A 3D model of a physical object is used to give the impression of the real world having expected tangible effects on the augmentation, in the way that it would on a real-world object. The 3D model represents structure present in the real world, and the information it provides about this structure allows an augmentation to be displayed as though it were a real-world 3D object, thereby providing an immersive augmented reality experience. The 3D model is in the form of a 3D mesh.

For example, based on the model of the real-world, an impression can be given of the augmentation being obscured by a real-world object that is in front of its perceived location from the perspective of the user; dynamically interacting with a real-world object, e.g. by moving around the object; statically interacting with a real-world object, say by sitting on top of it etc.

Whether or not real-world structure should affect an augmentation can be determined based on suitable rendering criteria. For example, by creating a 3D model of the perceived AR world, which includes the real-world surface structure and any augmentations, and projecting it onto a plane along the AR user's line of sight as determined using pose tracking (see below), a suitable criterion for determining whether a real-world object should be perceived as partially obscuring an augmentation is whether the projection of the real-world object in the plane overlaps with the projection of the augmentation, which could be further refined to account for transparent or opaque real-world structures. Generally the criteria can depend on the location and/or orientation of the augmented reality device and/or the real-world structure in question.
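A coarse, bounding-box version of that overlap test is sketched below in Python; the pinhole projection, camera-space convention (+z along the line of sight), and use of axis-aligned bounds are simplifying assumptions, not the criterion of this disclosure.

import numpy as np

def occludes(real_pts_cam: np.ndarray, aug_pts_cam: np.ndarray) -> bool:
    """Both inputs are N x 3 arrays of points in camera coordinates, with +z
    along the line of sight and all points assumed in front of the camera.
    Returns True when the real-world structure should be drawn in front of
    the augmentation: its image-plane projection overlaps the augmentation's
    projection and it lies closer to the camera."""
    def project(pts):
        # Pinhole projection onto the z = 1 image plane.
        return pts[:, :2] / pts[:, 2:3]

    def bbox(p):
        return p.min(axis=0), p.max(axis=0)

    (rmin, rmax), (amin, amax) = bbox(project(real_pts_cam)), bbox(project(aug_pts_cam))
    overlap = np.all(rmin <= amax) and np.all(amin <= rmax)
    closer = real_pts_cam[:, 2].min() < aug_pts_cam[:, 2].min()
    return bool(overlap and closer)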

An augmentation can also be mapped to the mesh, in the sense that its desired location and/or orientation is defined relative to a certain structure(s) in the mesh. Should that structure move and/or rotate causing a corresponding change in the mesh, when rendered properly this will cause corresponding change in the location and/or orientation of the augmentation. For example, the desired location of an augmentation may be on, and defined relative to, a table top structure; should the table be moved, the augmentation moves with it. Object recognition can be used to this end, for example to recognize a known shape of table and thereby detect when the table has moved using its recognizable structure. Such object recognition techniques are known in the art.

An augmentation that is mapped to the mesh in this manner, or is otherwise associated with a particular piece of surface structure embodied in a 3D model, is referred to as an "annotation" to that piece of surface structure. In order to annotate a piece of real-world surface structure, it is necessary to have that surface structure represented by the 3D model in question; without this, the real-world structure cannot be annotated.

The local modeling logic 608 generates a local 3D model “LM” of the environment in the memory 602, using the AR device’s own sensor(s) e.g. cameras 610 and/or any dedicated depth sensors etc. The local modeling logic 608 and sensor(s) constitute sensing apparatus.

The device tracking logic 606 tracks the location and orientation of the AR device, e.g. a headset, using local sensor readings captured from the AR device. The sensor readings can be captured in a number of ways, for example using the cameras 610  and/or other sensor(s) such as accelerometers. The device tracking logic 606 determines the current location and orientation of the AR device and provides this information to the rendering engine 616, for example by outputting a current “pose vector” of the AR device. The pose vector is a six dimensional vector, for example (x, y, z, P, R, Y) where (x,y,z) are the device’s Cartesian coordinates with respect to a suitable origin, and (P, R, Y) are the device’s pitch, roll and yaw with respect to suitable reference axes.
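A small sketch of such a pose vector, assuming Python, radians, and a yaw-pitch-roll (Z-Y-X) rotation convention that is not specified in this disclosure, might be:

import numpy as np
from dataclasses import dataclass

@dataclass
class Pose:
    # Six-dimensional pose vector (x, y, z, pitch, roll, yaw), angles in radians.
    x: float
    y: float
    z: float
    pitch: float
    roll: float
    yaw: float

    def rotation(self) -> np.ndarray:
        """Rotation matrix built from yaw (about Z), pitch (about Y), roll (about X),
        applied in that order (an assumed convention)."""
        cy, sy = np.cos(self.yaw), np.sin(self.yaw)
        cp, sp = np.cos(self.pitch), np.sin(self.pitch)
        cr, sr = np.cos(self.roll), np.sin(self.roll)
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        return Rz @ Ry @ Rx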

The rendering engine 616 adapts the local model based on the tracking, to account for the movement of the device, i.e. to maintain the perception of the augmentations as 3D elements occupying the real world, for example to ensure that static augmentations appear to remain static (which will in fact be achieved by scaling or rotating them as, from the AR user's perspective, the environment is moving relative to them).

The encoder 612 receives image data from the cameras 610 and audio data from the microphones 604, and possibly other types of data (e.g., annotations or text generated by the user of the AR device using the local augmentation logic 614), and transmits that information to other devices, for example the devices of collaborators in the AR environment. The decoder 620 receives an incoming data stream from other devices, and extracts audio, video, and possibly other types of data (e.g., annotations, text) therefrom.


Parts List

100

augmented reality environment

102

superimposing logic

104

virtual surface

106

physical surface

108

imaging sensor

110

user

112

virtual document

114

headset

116

sensor output

200

AR or VR system

202

virtual environment

204

operating system

206

virtual object

208

hardware

210

virtual surface

212

application

214

user

300

device

302

headpiece

304

central portion

306

left optical component

308

right optical component

310

left in-coupling zone

312

right in-coupling zone

314

left intermediate zone

316

right intermediate zone

318

left exit zone

320

right exit zone

322

left stereo camera

324

right stereo camera

326

left microphone

328

right microphone

330

left speaker

332

right speaker

400

augmented reality device logic

402

graphics engine

404

camera

406

processing units

408

CPU

410

GPU

412

WiFi

414

Bluetooth

416

speakers

418

microphones

420

memory

422

logic

500

AR device

502

processing units

504

input devices

506

memory

508

output devices

510

storage devices

512

network interface

514

logic

516

logic

518

logic

520

logic

600

AR device logic

602

memory

604

microphones

606

device tracking logic

608

local modeling logic

610

cameras

612

encoder

614

local augmentation logic

616

rendering engine

618

graphics engine

620

decoder

622

speakers


Terms/Definitions

virtual surface

projection location

camera

imaging sensor

texture image

logic

filtered texture

virtual environment

interaction

system

virtual reality

the computer-generated simulation of a three-dimensional environment that can be interacted with in a seemingly real or physical way by a person using special electronic equipment, such as a headset with a display and gloves fitted with sensors.

augmented reality

technology that superimposes computer-generated imagery on a user’s view of the real world, thus providing a composite view.

virtualize

converting a physical thing to a computer-generated simulation of that thing.

Implementation of Cloud Infrastructure


Drawings

Brief Description:

illustrates an example system 100 in accordance with one embodiment.

Detailed Description:

Referring now to Figure 1, an example system 100 includes a cloud infrastructure 102 that includes a scanning service 104 and virtual machines running on one or more virtual private cloud 114. Applications instantiated on virtual machines within the virtual private cloud 114 may access one or more cloud data stores 106 for storage of data. Administrators may configure the virtual private cloud in zones, and may architect applications to store data in and receive data from the cloud data store 106 so as to provide fault tolerance and availability.

While the cloud infrastructure 102, scanning service 104, and virtual machine instances in the virtual private cloud 114 may be described with respect to cloud-based infrastructure generally, and with respect to Amazon Web Services (AWS) and AWS S3 buckets as an example implementation, it should be understood that the architecture and concept may be used with any suitable cloud service and related storage system. For example, cloud infrastructure services available from Microsoft Azure, CenturyLink Cloud, VMware, Rackspace, Joyent, and Google may be suitable in various implementations, as well as other cloud infrastructure or infrastructure-as-a-service providers, with adjustments or modifications as may be needed for a particular implementation.

Deployment of the scanning service 104 may be accomplished with a workflow that is intended to be relatively simple for an administrator to initiate and manage, and that relieves the requirement to deploy and manage an agent on application templates or instances as they are created. This may enable, for example, in some implementations, a usage-based billing model as compared to a per-seat license for each image created, which may be desirable with cloud billing models, and particularly in an auto-scaling environment. As instances are created and shut down, instantiations of the scanning service 104 may be based on the load on the scanning service 104, and may be managed, for example, by the security manager 116, rather than by the administrator of the applications running in the virtual private cloud 114.

In some implementations, installation and registration may involve setting permissions and authentication configuration, so that a cloud scanning provider handles administration of the scanning application and datasets without additional impact to customers' workflows. This reduces complexity for the application administrator when adding data protection capability to applications.

The system 100 includes a cloud infrastructure 102 that provides computing resources for execution of software applications, data storage, and resource management, and may provide other services as well. In an example implementation, the cloud infrastructure 102 is implemented with the AWS service, although, as mentioned above, other suitable cloud infrastructure services may be used.

The cloud infrastructure 102 may include a cloud data store 106. The cloud data store 106 may be used by applications within the cloud infrastructure 102 to store data. The cloud data store 106 may be used, for example, by a web application operating within the cloud infrastructure 102 to store files uploaded to the web application by a user. The cloud data store 106 may receive one or more files directly or indirectly from applications, such as mobile apps, operating on a user device 122 or a mobile device 322B. In an AWS implementation, the cloud data store 106 may be implemented with the AWS Simple Storage Service (S3). Other cloud data services may be used instead or in addition.

The cloud infrastructure 102 may include a scanning service 104. The scanning service 104 may be implemented with one or more scanning applications running on one or more virtual machines within the cloud infrastructure 102. The scanning service 104 may receive policies from a security manager 116 and also may provide status information, events, and alerts to the security manager 116.

The security manager 116 may be implemented within the cloud infrastructure 102 or outside the cloud infrastructure 102. The security manager 116 may provide a web-based management interface for configuration of the scanning service 104 and for an administrator to manage their use of the scanning service 104 and potentially other security applications. For example, the security manager 116 may provide management for endpoint protection, firewalls, and so forth. In some implementations, one or more firewalls under management by the security manager 116 are included in the virtual private cloud 114 and may be managed by the security manager 116.

The scanning service 104 may receive data updates from a data distribution service (DDS 118). Data from the DDS 118 may include, for example, code updates and definitions of known or potential malicious files, portions of files, code, or content, or code that may be used to identify malicious files, applications, or the like. The definition files may contain one or more commands, definitions, patterns, or instructions to be parsed and acted upon, matched, or the like. Patterns may include, for example, patterns identifying files or portions of files that fit a specific pattern, or that were identified in malicious files. Patterns also may include, for example, patterns identifying code that has the same effect as code that is known to be malicious. The data updates may be used by the scanning service 104 when scanning files.

The scanning service 104 may exchange security-related information, such as files or portions of files and resource reputation information, with a security data lookup service 120. The data lookup service 120 may be provided within the cloud infrastructure 102 or outside the cloud infrastructure 102. The data lookup service 120 may be used, for example, to check patterns identified by the scanning service 104, determine reputations of resources identified or provided by the scanning service 104, and so forth. The scanning service 104 may provide files or data to the data lookup service 120 for further analysis. In some implementations, the scanning service 104 may initiate sending data to the data lookup service 120 under a variety of circumstances, for example, if the scanning service 104 is unable to determine whether a file or a portion of a file is malicious, or the relevance of code or other content, or if the reputation of a file is unknown. The data lookup service 120 may request a file or data to be provided to the data lookup service 120 for further investigation.

The scanning service 104 accesses files to be scanned directly from the cloud data store 106 that is used by the virtual machine instances, which avoids overhead and performance delay. The use of virtual machine instances for the scanning that are different from the virtual machine instances of the application facilitates management and reduces complexity. In some implementations, the scanning service provides alerts to an administrator, but does not attempt to control access to files. In some implementations, the scanning service may move files or change the names of files in order to control access. For example, the scanning service may change the name or the location (e.g., the path in a file system) of a file in order to prevent access to it. In some implementations, the scanning service 104 may replace a file with another file that is "clean."
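For the AWS S3 example implementation mentioned above, one possible, non-authoritative way to move a flagged object so that applications can no longer resolve its original key is sketched below in Python with boto3; the quarantine/ prefix and the bucket and key names are illustrative assumptions, and the scanning service would need credentials permitted to copy and delete objects.

import boto3

s3 = boto3.client("s3")

def quarantine(bucket: str, key: str) -> None:
    """Move an object under a quarantine/ prefix so that applications using the
    original key can no longer access it at that location."""
    s3.copy_object(Bucket=bucket, Key=f"quarantine/{key}",
                   CopySource={"Bucket": bucket, "Key": key})
    s3.delete_object(Bucket=bucket, Key=key)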

In some implementations, file permissions are used to control the use of files. For example, if the scanning service 104 has been configured with an account having the appropriate permissions, the scanning service 104 may change the permissions of files in the cloud data store 106 to permit or deny access to files by the applications in the virtual private cloud 114. Use of file permissions to control file access provides security for data without a need for lengthy setup or installation. This reduces the costs to deploy and provision and takes advantage of the benefits of the cloud, which are to distribute processing and avoid the need for custom infrastructure, which in turn reduces the total cost of ownership for cloud applications.

In some implementations in which permissions are used, an application stores a file in the cloud data store 106 with default permissions that permit access by applications running in the virtual private cloud 114. The scanning service 104 receives notification of the storage event from the cloud data store 106, and the scanning service 104 scans the file. If access to the file needs to be restricted based on the scan, the scanning service 104 changes the permissions of the file so that applications in the virtual private cloud 114 can no longer access the file.

In some implementations in which permissions are used, an application stores a file in the cloud data store 106 with default permissions that do not permit access by applications running in the virtual private cloud 114. The scanning service 104 receives notification of the storage event from the cloud data store 106, and the scanning service 104 scans the file. If access to the file needs to be restricted based on the scan, the scanning service 104 does not change the permissions of the file, so that applications in the virtual private cloud 114 still cannot access the file. If access to the file does not need to be restricted based on the scan, the scanning service 104 changes the permissions of the file so that applications in the virtual private cloud 114 may access the file.
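A minimal sketch of this default-deny flow, again assuming the AWS S3 example implementation and boto3, is shown below; the scan_is_clean() predicate is a hypothetical stand-in for the scanning service 104 verdict, and the canned ACL used here is only one possible permission mechanism.

import boto3

s3 = boto3.client("s3")

def handle_storage_event(bucket: str, key: str, scan_is_clean) -> None:
    """Scan a newly stored object and grant read access only if the scan passes.
    The object is assumed to have been stored with restrictive default permissions."""
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    if scan_is_clean(body):
        # Allow applications in the virtual private cloud to read the file.
        s3.put_object_acl(Bucket=bucket, Key=key, ACL="authenticated-read")
    # Otherwise leave the restrictive default permissions in place.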

The cloud infrastructure 102 may include a virtual private cloud 114 (VPC) including one or more computing resources on which virtual machine instances are implemented. For example, the virtual private cloud 114 may include one or more applications such as a software application, a web application, a virtual desktop, a server application, etc. Applications in the virtual private cloud 114 may access and store data in the cloud data store 106, depending on the permissions assigned to the files in the cloud data store 106. In some implementations, some or all of the applications may be implemented on infrastructure inside or outside of the cloud. For example, applications may be implemented in a co-location facility or in a data center not associated with a cloud infrastructure. For example, applications may be implemented on a user device, such as a mobile app or desktop computer application. Applications implemented outside of the cloud may make use of cloud resources, such as cloud storage. Use of the scanning techniques described with respect to cloud storage may be useful even if the applications are partially or entirely implemented outside of the cloud infrastructure, for example, with the exception of the cloud storage.

User devices 122 may be in communication with the cloud data store 106. The user devices 122 may have applications that directly store data in the cloud data store 106. The user devices 122 may be in communication with one or more applications in the virtual private cloud 114, which in turn store data in the cloud data store 106.

An example is presented in which the cloud data store 106 includes three files: clean files 108, clean files 110, and malicious file 112. The clean files 108, clean files 110, and malicious file 112 may be any sort of data file or collection of data files (e.g., a word processing file, an image, a video, an archive collection of files, etc.). In this example, there may be a first clean file 108 and a second clean file 110. The clean files 108, 110 may be clean in the sense that they do not contain content that would be identified by the scanning service 104 as requiring reporting or restriction. The cloud data store 106 also includes a third file, malicious file 112, which contains content that may be identified by the scanning service 104 as requiring restriction of file access. For example, the malicious file 112 may include malware or other malicious content. For example, the file may include content that should be protected from distribution under a policy.

In some implementations, access to the malicious file 112 by applications running on the virtual private cloud 114 may be prevented through the use of permissions associated with the malicious file 112 within the cloud data store 106, while the clean files 108, 110 may have other permissions assigned, and so applications running on the virtual private cloud 114 would not be blocked. As a result, applications running on the VPC may access the clean files 108, 110 but not the malicious file 112. In some implementations, the file names of the clean files 108, 110 are not changed, but the file name of the malicious file 112 is changed such that applications running on the virtual private cloud 114 cannot access the malicious file 112. In some implementations, the clean files 108, 110 are not moved, but the malicious file 112 is moved such that applications running on the virtual private cloud 114 cannot access the malicious file 112.


Parts List

100

example system

102

cloud infrastructure

104

scanning service

106

cloud data store

108

clean files

110

clean files

112

malicious file

114

virtual private cloud

116

security manager

118

DDS

120

data lookup service

122

user devices


Terms/Definitions

file name

instantiations

web-based management interface

google

infrastructure-as-a-service providers

files or portions

elements

distribution

particular embodiments

security-related information

ordinary skill

further analysis

advantage

video

implementations

functional information one

data store

diamond

application stores

circuits

same effect

three files

code

Microsoft Azure

application templates or instances

digital signal processor circuit

malicious file

desirable order

relevance

firewalls

virtual private cloud (VPC)

malicious files

instructions

cloud-based infrastructure

virtual private clouds

syntax

account

joyent

file system

reporting or restriction

sort

s3 buckets

service

scanning service

clean files

potential malicious files

AWS service

alerts

data

their use

user devices

third file

server application

benefits

functionally equivalent circuits

cloud infrastructure service

particular programming language

default permissions

policies

co-location facility

total cost

mobile app

configuration

workflow

facility

portions

Amazon Web Services

administration

security data

turn

instances

initialization

per-seat license

second clean file

many routine program elements

AWS implementation

administrator

virtual desktop

reputation

permission

various implementations

Data Distribution Service (DDS)

AWS Simple Storage Service

rackspace

software applications

patterns

circumstances

application administrator

file names

hardware implementation

management

present invention

storage

computer software instructions or groups

computer software instructions

communication

customers’ workflows

mobile device

flow diagrams

example implementation

software application

example system

files or data

presently disclosed methods

store files

loops and variables

access

virtual private cloud

invention

agent

adjustments or modifications

cloud

permissions

location

first clean file

particular implementation

data protection capability

applications

VMware

word processing file

application

scanning techniques

resource management

cloud infrastructure services

computing resources

cloud resources

deploy and provision

additional impact

cloud scanning provider

reputations

cloud storage

specific pattern

block diagram

related storage system

endpoint protection

costs

cloud billing models

auto-scaling environment

requirement

custom infrastructure

infrastructure

exception

code updates and definitions

lengthy setup or installation

complexity

mobile apps

scan

virtual machines

storage event

portion

events

web application

unordered meaning

user

data file or collection

status information

resources

cloud infrastructure

figure

definition files

particular sequence

processing

data center

administrators

policy

potentially other security applications

restriction

application facilitate management

file

appropriate permissions

implementation

cloud applications

delay

processing and decision blocks

security

processing blocks

file permissions

virtual machine instances

computer software

result

data lookup service

specific integrated circuit

example

cloud data store

archive collection

file access

need

notification

respect

zones

spirit

execution

load

content

security manager

suitable cloud service

data storage

usage-based billing model

ownership

data lookup

steps

image

architecture and concept

user device

desktop computer application

groups

temporary variables

files

rectangular elements

fault tolerance and availability

installation and registration

file or data

permissions and authentication configuration

addition

CenturyLink Cloud

files and resource reputation information

further investigation

system

variety

scanning

name

data files

scanning application and datasets

data updates

malware