KLL Compiler Re-Write
This was many months of efforts in re-designing how the KLL compiler should work. The major problem with the original compiler was how difficult it was to extend language wise. This lead to many delays in KLL 0.4 and 0.5 being implemented. The new design is a multi-staged compiler, where even tokenization occurs over multiple stages. This allows individual parsing and token regexes to be expressed more simply without affect other expressions. Another area of change is the concept of Contexts. In the original KLL compiler the idea of a cache assigned was "hacked" on when I realized the language was "broken" (after nearly finishing the compiler). Since assignment order is generally considered not to matter for keymappings, I created a "cached" assignment where the whole file is read into a sub-datastructure, then apply to the master datastructure. Unfortunately, this wasn't really all that clear, so it was annoying to work with. To remedy this, I created KLL Contexts, which contain information about a group of expressions. Not only can these groups can be merged with other Contexts, they have historical data about how they were generated allowing for errors very late in processing to be pin-pointed back to the offending kll file. Backends work nearly the same as they did before. However, all call-backs for capability evaluations have been removed. This makes the interface much cleaner as Contexts can only be symbolically merged now. (Previously datastructures did evaluation merges where the ScanCode or Capability was looked up right before passing to the backend, but this required additional information from the backend). Many of the old parsing and tokenization rules have been reused, along with the hid_dict.py code. The new design takes advantage of processor pools to handle multithreading where it makes sense. For example, all specified files are loaded into ram simulatenously rather than sparingly reading from. The reason for this is so that each Context always has all the information it requires at all times. 
kll - Program entry point (previously kll.py) - Very small now, does some setting up of command-line args - Most command-line args are specified by the corresponding processing stage common/channel.py - Pixel Channel container classes common/context.py - Context container classes - As is usual with other files, blank classes inherit a base class - These blank classes are identified by the class name itself to handle special behaviour - And if/when necessary functions are re-implemented - MergeConext class facilitates merging of contexts while maintaining lineage common/expression.py - Expression container classes * Expression base class * AssignmentExpression * NameAssociationExpression * DataAssociationExpression * MapExpression - These classes are used to store expressions after they have finished parsing and tokenization common/file.py - Container class for files being read by the KLL compiler common/emitter.py - Base class for all KLL emitters - TextEmitter for dealing with text file templates common/hid_dict.py - Slightly modified version of kll_lib/hid_dict.py common/id.py - Identification container classes - Used to indentify different types of elements used within the KLL language common/modifier.py - Container classes for animation and pixel change functions common/organization.py - Data structure merging container classes - Contains all the sub-datastructure classes as well - The Organization class handles the merge orchestration and expression insertion common/parse.py - Parsing rules for funcparserlib - Much of this file was taken from the original kll.py - Many changes to support the multi-stage processing and support KLL 0.5 common/position.py - Container class dealing with physical positions common/schedule.py - Container class dealing with scheduling and timing events common/stage.py - Contains ControlStage and main Stage classes * CompilerConfigurationStage * FileImportStage * PreprocessorStage * OperationClassificationStage * OperationSpecificsStage * OperationOrganizationStage * DataOrganziationStage * DataFinalizationStage * DataAnalysisStage * CodeGenerationStage * ReportGenerationStage - Each of these classes controls the life-cycle of each stage - If multi-threading is desired, it must be handled within the class * The next stage will not start until the current stage is finished - Errors are handled such that as many errors as possible are recorded before forcing an exit * The exit is handled at the end of each stage if necessary - Command-line arguments for each stage can be defined if necessary (they are given their own grouping) - Each stage can pull variables and functions from other stages if necessary using a name lookup * This means you don't have to worry about over-arching datastructures emitters/emitters.py - Container class for KLL emitters - Handles emitter setup and selection emitters/kiibohd/kiibohd.py - kiibohd .h file KLL emitter - Re-uses some backend code from the original KLL compiler funcparserlib/parser.py - Added debug mode control examples/assignment.kll examples/defaultMapExample.kll examples/example.kll examples/hhkbpro2.kll examples/leds.kll examples/mapping.kll examples/simple1.kll examples/simple2.kll examples/simpleExample.kll examples/state_scheduling.kll - Updating/Adding rules for new compiler and KLL 0.4 + KLL 0.5 support
This commit is contained in:
parent
c2a798f1cb
commit
f610d0fb15
0
common/__init__.py
Normal file
0
common/__init__.py
Normal file
75
common/channel.py
Normal file
75
common/channel.py
Normal file
@ -0,0 +1,75 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Channel Containers
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Channel:
|
||||
'''
|
||||
Pixel Channel Container
|
||||
'''
|
||||
def __init__( self, uid, width ):
|
||||
self.uid = uid
|
||||
self.width = width
|
||||
|
||||
def __repr__( self ):
|
||||
return "{0}:{1}".format( self.uid, self.width )
|
||||
|
||||
|
||||
class ChannelList:
|
||||
'''
|
||||
Pixel Channel List Container
|
||||
'''
|
||||
def __init__( self ):
|
||||
self.channels = []
|
||||
|
||||
def setChannels( self, channel_list ):
|
||||
'''
|
||||
Apply channels to Pixel
|
||||
'''
|
||||
for channel in channel_list:
|
||||
self.channels.append( Channel( channel[0], channel[1] ) )
|
||||
|
||||
def strChannels( self ):
|
||||
'''
|
||||
__repr__ of Channel when multiple inheritance is used
|
||||
'''
|
||||
output = ""
|
||||
for index, channel in enumerate( self.channels ):
|
||||
if index > 0:
|
||||
output += ","
|
||||
output += "{0}".format( channel )
|
||||
|
||||
return output
|
||||
|
||||
def __repr__( self ):
|
||||
return self.strChannels()
|
||||
|
256
common/context.py
Normal file
256
common/context.py
Normal file
@ -0,0 +1,256 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Context Definitions
|
||||
* Generic (auto-detected)
|
||||
* Configuration
|
||||
* BaseMap
|
||||
* DefaultMap
|
||||
* PartialMap
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
import copy
|
||||
import os
|
||||
|
||||
import common.organization as organization
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Context:
|
||||
'''
|
||||
Base KLL Context Class
|
||||
'''
|
||||
def __init__( self ):
|
||||
'''
|
||||
Context initialization
|
||||
'''
|
||||
# Each context may have one or more included kll files
|
||||
# And each of these files will have at least 1 Context
|
||||
self.kll_files = []
|
||||
|
||||
# File data assigned to each context
|
||||
# This info is populated during the PreprocessorStage
|
||||
self.lines = []
|
||||
self.data = ""
|
||||
self.parent = None
|
||||
|
||||
# Tokenized data sets
|
||||
self.classification_token_data = []
|
||||
self.expressions = []
|
||||
|
||||
# Organized Expression Datastructure
|
||||
self.organization = organization.Organization()
|
||||
|
||||
def __repr__( self ):
|
||||
# Build list of all the info
|
||||
return "(kll_files={0}, lines={1}, data='''{2}''')".format(
|
||||
self.kll_files,
|
||||
self.lines,
|
||||
self.data,
|
||||
)
|
||||
|
||||
def initial_context( self, lines, data, parent ):
|
||||
'''
|
||||
Used in the PreprocessorStage to update the initial line and kll file data
|
||||
|
||||
@param lines: Data split per line
|
||||
@param data: Entire context in a single string
|
||||
@param parent: Parent node, always a KLLFile
|
||||
'''
|
||||
self.lines = lines
|
||||
self.data = data
|
||||
self.parent = parent
|
||||
|
||||
def query( self, kll_expression, kll_type ):
|
||||
'''
|
||||
Query
|
||||
|
||||
Returns a dictionary of the specified property.
|
||||
Most queries should use this.
|
||||
|
||||
See organization.py:Organization for property_type details.
|
||||
|
||||
@param kll_expression: String name of expression type
|
||||
@param kll_type: String name of the expression sub-type
|
||||
|
||||
@return: context_name: (dictionary)
|
||||
'''
|
||||
return self.organization.data_mapping[ kll_expression ][ kll_type ]
|
||||
|
||||
|
||||
class GenericContext( Context ):
|
||||
'''
|
||||
Generic KLL Context Class
|
||||
'''
|
||||
|
||||
|
||||
class ConfigurationContext( Context ):
|
||||
'''
|
||||
Configuration KLL Context Class
|
||||
'''
|
||||
|
||||
|
||||
class BaseMapContext( Context ):
|
||||
'''
|
||||
Base Map KLL Context Class
|
||||
'''
|
||||
|
||||
|
||||
class DefaultMapContext( Context ):
|
||||
'''
|
||||
Default Map KLL Context Class
|
||||
'''
|
||||
|
||||
|
||||
class PartialMapContext( Context ):
|
||||
'''
|
||||
Partial Map KLL Context Class
|
||||
'''
|
||||
def __init__( self, layer ):
|
||||
'''
|
||||
Partial Map Layer Context Initialization
|
||||
|
||||
@param: Layer associated with Partial Map
|
||||
'''
|
||||
super().__init__()
|
||||
|
||||
self.layer = layer
|
||||
|
||||
|
||||
class MergeContext( Context ):
|
||||
'''
|
||||
Container class for a merged Context
|
||||
|
||||
Has references to the original contexts merged in
|
||||
'''
|
||||
def __init__( self, base_context ):
|
||||
'''
|
||||
Initialize the MergeContext with the base context
|
||||
Another MergeContext can be used as the base_context
|
||||
|
||||
@param base_context: Context used to seed the MergeContext
|
||||
'''
|
||||
super().__init__()
|
||||
|
||||
# List of context, in the order of merging, starting from the base
|
||||
self.contexts = [ base_context ]
|
||||
|
||||
# Copy the base context Organization into the MergeContext
|
||||
self.organization = copy.copy( base_context.organization )
|
||||
|
||||
# Set the layer if the base is a PartialMapContext
|
||||
if base_context.__class__.__name__ == 'PartialMapContext':
|
||||
self.layer = base_context.layer
|
||||
|
||||
def merge( self, merge_in, debug ):
|
||||
'''
|
||||
Merge in context
|
||||
|
||||
Another MergeContext can be merged into a MergeContext
|
||||
|
||||
@param merge_in: Context to merge in to this one
|
||||
@param debug: Enable debug out
|
||||
'''
|
||||
# Append to context list
|
||||
self.contexts.append( merge_in )
|
||||
|
||||
# Merge context
|
||||
self.organization.merge(
|
||||
merge_in.organization,
|
||||
debug
|
||||
)
|
||||
|
||||
# Set the layer if the base is a PartialMapContext
|
||||
if merge_in.__class__.__name__ == 'PartialMapContext':
|
||||
self.layer = merge_in.layer
|
||||
|
||||
def reduction( self ):
|
||||
'''
|
||||
Simplifies datastructure
|
||||
|
||||
NOTE: This will remove data, therefore, context is lost
|
||||
'''
|
||||
self.organization.reduction()
|
||||
|
||||
def paths( self ):
|
||||
'''
|
||||
Returns list of file paths used to generate this context
|
||||
'''
|
||||
file_paths = []
|
||||
|
||||
for kll_context in self.contexts:
|
||||
# If context is a MergeContext then we have to recursively search
|
||||
if kll_context.__class__.__name__ is 'MergeContext':
|
||||
file_paths.extend( kll_context.paths() )
|
||||
else:
|
||||
file_paths.append( kll_context.parent.path )
|
||||
|
||||
return file_paths
|
||||
|
||||
def files( self ):
|
||||
'''
|
||||
Short form list of file paths used to generate this context
|
||||
'''
|
||||
file_paths = []
|
||||
for file_path in self.paths():
|
||||
file_paths.append( os.path.basename( file_path ) )
|
||||
|
||||
return file_paths
|
||||
|
||||
def __repr__( self ):
|
||||
return "(kll_files={0}, organization={1})".format(
|
||||
self.files(),
|
||||
self.organization,
|
||||
)
|
||||
|
||||
def query_contexts( self, kll_expression, kll_type ):
|
||||
'''
|
||||
Context Query
|
||||
|
||||
Returns a list of tuples (dictionary, kll_context) doing a deep search to the context leaf nodes.
|
||||
This results in pre-merge data and is useful for querying information about files used during compilation.
|
||||
|
||||
See organization.py:Organization for property_type details.
|
||||
|
||||
@param kll_expression: String name of expression type
|
||||
@param kll_type: String name of the expression sub-type
|
||||
|
||||
@return: context_name: (dictionary, kll_context)
|
||||
'''
|
||||
# Build list of leaf contexts
|
||||
leaf_contexts = []
|
||||
for kll_context in self.contexts:
|
||||
# Recursively search if necessary
|
||||
if kll_context.__class__.__name__ == 'MergeContext':
|
||||
leaf_contexts.extend( kll_context.query_contexts( kll_expression, kll_type ) )
|
||||
else:
|
||||
leaf_contexts.append( ( kll_context.query( kll_expression, kll_type ), kll_context ) )
|
||||
|
||||
return leaf_contexts
|
||||
|
191
common/emitter.py
Normal file
191
common/emitter.py
Normal file
@ -0,0 +1,191 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Emitter Base Classes
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
import re
|
||||
import os
|
||||
import sys
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Emitter:
|
||||
'''
|
||||
KLL Emitter Base Class
|
||||
|
||||
NOTE: Emitter should do as little as possible in the __init__ function.
|
||||
'''
|
||||
def __init__( self, control ):
|
||||
'''
|
||||
Emitter initialization
|
||||
|
||||
@param control: ControlStage object, used to access data from other stages
|
||||
'''
|
||||
self.control = control
|
||||
self.color = False
|
||||
|
||||
def command_line_args( self, args ):
|
||||
'''
|
||||
Group parser for command line arguments
|
||||
|
||||
@param args: Name space of processed arguments
|
||||
'''
|
||||
print( "{0} '{1}' '{2}' has not been implemented yet"
|
||||
.format(
|
||||
WARNING,
|
||||
self.command_line_args.__name__,
|
||||
type( self ).__name__
|
||||
)
|
||||
)
|
||||
|
||||
def command_line_flags( self, parser ):
|
||||
'''
|
||||
Group parser for command line options
|
||||
|
||||
@param parser: argparse setup object
|
||||
'''
|
||||
print( "{0} '{1}' '{2}' has not been implemented yet"
|
||||
.format(
|
||||
WARNING,
|
||||
self.command_line_flags.__name__,
|
||||
type( self ).__name__
|
||||
)
|
||||
)
|
||||
|
||||
def process( self ):
|
||||
'''
|
||||
Emitter Processing
|
||||
'''
|
||||
print( "{0} '{1}' '{2}' has not been implemented yet"
|
||||
.format(
|
||||
WARNING,
|
||||
self.process.__name__,
|
||||
type( self ).__name__
|
||||
)
|
||||
)
|
||||
|
||||
def output( self ):
|
||||
'''
|
||||
Final Stage of Emitter
|
||||
|
||||
Generate desired outputs
|
||||
'''
|
||||
print( "{0} '{1}' '{2}' has not been implemented yet"
|
||||
.format(
|
||||
WARNING,
|
||||
self.output.__name__,
|
||||
type( self ).__name__
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
class TextEmitter:
|
||||
'''
|
||||
KLL Text Emitter Class
|
||||
|
||||
Base class for any text emitter that wants to use the templating functionality
|
||||
|
||||
If multiple files need to be generated, call load_template and generate multiple times.
|
||||
e.g.
|
||||
load_template('_myfile.h')
|
||||
generate('/tmp/myfile.h')
|
||||
|
||||
load_template('_myfile2.h')
|
||||
generate('/tmp/myfile2.h')
|
||||
|
||||
TODO
|
||||
- Generate list of unused tags
|
||||
'''
|
||||
def __init__( self ):
|
||||
'''
|
||||
TextEmitter Initialization
|
||||
'''
|
||||
# Dictionary used to do template replacements
|
||||
self.fill_dict = {}
|
||||
self.tag_list = []
|
||||
|
||||
self.template = None
|
||||
|
||||
def load_template( self, template ):
|
||||
'''
|
||||
Loads template file
|
||||
|
||||
Looks for <|tags|> to replace in the template
|
||||
|
||||
@param template: Path to template
|
||||
'''
|
||||
|
||||
# Does template exist?
|
||||
if not os.path.isfile( template ):
|
||||
print ( "{0} '{1}' does not exist...".format( ERROR, template ) )
|
||||
sys.exit( 1 )
|
||||
|
||||
self.template = template
|
||||
|
||||
# Generate list of fill tags
|
||||
with open( template, 'r' ) as openFile:
|
||||
for line in openFile:
|
||||
match = re.findall( r'<\|([^|>]+)\|>', line )
|
||||
for item in match:
|
||||
self.tag_list.append( item )
|
||||
|
||||
def generate( self, output_path ):
|
||||
'''
|
||||
Generates the output file from the template file
|
||||
|
||||
@param output_path: Path to the generated file
|
||||
'''
|
||||
# Make sure we've called load_template at least once
|
||||
if self.template is None:
|
||||
print ( "{0} TextEmitter template (load_template) has not been called.".format( ERROR ) )
|
||||
sys.exit( 1 )
|
||||
|
||||
# Process each line of the template, outputting to the target path
|
||||
with open( output_path, 'w' ) as outputFile:
|
||||
with open( self.template, 'r' ) as templateFile:
|
||||
for line in templateFile:
|
||||
# TODO Support multiple replacements per line
|
||||
# TODO Support replacement with other text inline
|
||||
match = re.findall( r'<\|([^|>]+)\|>', line )
|
||||
|
||||
# If match, replace with processed variable
|
||||
if match:
|
||||
try:
|
||||
outputFile.write( self.fill_dict[ match[0] ] )
|
||||
except KeyError as err:
|
||||
print( "{0} '{1}' not found, skipping...".format(
|
||||
WARNING, match[0]
|
||||
) )
|
||||
outputFile.write("\n")
|
||||
|
||||
# Otherwise, just append template to output file
|
||||
else:
|
||||
outputFile.write( line )
|
||||
|
630
common/expression.py
Normal file
630
common/expression.py
Normal file
@ -0,0 +1,630 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Expression Container
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
import copy
|
||||
|
||||
from common.id import CapId
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Expression:
|
||||
'''
|
||||
Container class for KLL expressions
|
||||
'''
|
||||
def __init__( self, lparam, operator, rparam, context ):
|
||||
'''
|
||||
Initialize expression container
|
||||
|
||||
@param lparam: LOperatorData token
|
||||
@param operator: Operator token
|
||||
@param rparam: ROperatorData token
|
||||
@param context: Parent context of expression
|
||||
'''
|
||||
# First stage/init
|
||||
self.lparam_token = lparam
|
||||
self.operator_token = operator
|
||||
self.rparam_token = rparam
|
||||
self.context = context # TODO, set multiple contexts for later stages
|
||||
|
||||
# Second stage
|
||||
self.lparam_sub_tokens = []
|
||||
self.rparam_sub_tokens = []
|
||||
|
||||
# Mutate class into the desired type
|
||||
self.__class__ = {
|
||||
'=>' : NameAssociationExpression,
|
||||
'<=' : DataAssociationExpression,
|
||||
'=' : AssignmentExpression,
|
||||
':' : MapExpression,
|
||||
}[ self.operator_type() ]
|
||||
|
||||
def operator_type( self ):
|
||||
'''
|
||||
Determine which base operator this operator is of
|
||||
|
||||
All : (map) expressions are tokenized/parsed the same way
|
||||
|
||||
@return Base string representation of the operator
|
||||
'''
|
||||
if ':' in self.operator_token.value:
|
||||
return ':'
|
||||
|
||||
return self.operator_token.value
|
||||
|
||||
|
||||
def final_tokens( self, no_filter=False ):
|
||||
'''
|
||||
Return the final list of tokens, must complete the second stage first
|
||||
|
||||
@param no_filter: If true, do not filter out Space tokens
|
||||
|
||||
@return Finalized list of tokens
|
||||
'''
|
||||
ret = self.lparam_sub_tokens + [ self.operator_token ] + self.rparam_sub_tokens
|
||||
|
||||
if not no_filter:
|
||||
ret = [ x for x in ret if x.type != 'Space' ]
|
||||
return ret
|
||||
|
||||
def regen_str( self ):
|
||||
'''
|
||||
Re-construct the string based off the original set of tokens
|
||||
|
||||
<lparam><operator><rparam>;
|
||||
'''
|
||||
return "{0}{1}{2};".format(
|
||||
self.lparam_token.value,
|
||||
self.operator_token.value,
|
||||
self.rparam_token.value,
|
||||
)
|
||||
|
||||
def point_chars( self, pos_list ):
|
||||
'''
|
||||
Using the regenerated string, point to a given list of characters
|
||||
Used to indicate where a possible issue/syntax error is
|
||||
|
||||
@param pos_list: List of character indices
|
||||
|
||||
i.e.
|
||||
> U"A" : : U"1";
|
||||
> ^
|
||||
'''
|
||||
out = "\t{0}\n\t".format( self.regen_str() )
|
||||
|
||||
# Place a ^ character at the given locations
|
||||
curpos = 1
|
||||
for pos in sorted( pos_list ):
|
||||
# Pad spaces, then add a ^
|
||||
out += ' ' * (pos - curpos)
|
||||
out += '^'
|
||||
curpos += pos
|
||||
|
||||
return out
|
||||
|
||||
def rparam_start( self ):
|
||||
'''
|
||||
Starting positing char of rparam_token in a regen_str
|
||||
'''
|
||||
return len( self.lparam_token.value ) + len( self.operator_token.value )
|
||||
|
||||
def __repr__( self ):
|
||||
# Build string representation based off of what has been set
|
||||
# lparam, operator and rparam are always set
|
||||
out = "Expression: {0}{1}{2}".format(
|
||||
self.lparam_token.value,
|
||||
self.operator_token.value,
|
||||
self.rparam_token.value,
|
||||
)
|
||||
|
||||
# TODO - Add more depending on what has been set
|
||||
return out
|
||||
|
||||
def unique_keys( self ):
|
||||
'''
|
||||
Generates a list of unique identifiers for the expression that is mergeable
|
||||
with other functional equivalent expressions.
|
||||
|
||||
This method should never get called directly as a generic Expression
|
||||
'''
|
||||
return [ ('UNKNOWN KEY', 'UNKNOWN EXPRESSION') ]
|
||||
|
||||
|
||||
class AssignmentExpression( Expression ):
|
||||
'''
|
||||
Container class for assignment KLL expressions
|
||||
'''
|
||||
type = None
|
||||
name = None
|
||||
pos = None
|
||||
value = None
|
||||
|
||||
## Setters ##
|
||||
def array( self, name, pos, value ):
|
||||
'''
|
||||
Assign array assignment parameters to expression
|
||||
|
||||
@param name: Name of variable
|
||||
@param pos: Array position of the value (if None, overwrite the entire array)
|
||||
@param value: Value of the array, if pos is specified, this is the value of an element
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
self.type = 'Array'
|
||||
self.name = name
|
||||
self.pos = pos
|
||||
self.value = value
|
||||
|
||||
# If pos is not none, flatten
|
||||
if pos is not None:
|
||||
self.value = "".join( str( x ) for x in self.value )
|
||||
|
||||
return True
|
||||
|
||||
def variable( self, name, value ):
|
||||
'''
|
||||
Assign variable assignment parameters to expression
|
||||
|
||||
@param name: Name of variable
|
||||
@param value: Value of variable
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
self.type = 'Variable'
|
||||
self.name = name
|
||||
self.value = value
|
||||
|
||||
# Flatten value, often a list of various token types
|
||||
self.value = "".join( str( x ) for x in self.value )
|
||||
|
||||
return True
|
||||
|
||||
def __repr__( self ):
|
||||
if self.type == 'Variable':
|
||||
return "{0} = {1};".format( self.name, self.value )
|
||||
elif self.type == 'Array':
|
||||
return "{0}[{1}] = {2};".format( self.name, self.pos, self.value )
|
||||
|
||||
return "ASSIGNMENT UNKNOWN"
|
||||
|
||||
def unique_keys( self ):
|
||||
'''
|
||||
Generates a list of unique identifiers for the expression that is mergeable
|
||||
with other functional equivalent expressions.
|
||||
'''
|
||||
return [ ( self.name, self ) ]
|
||||
|
||||
|
||||
class NameAssociationExpression( Expression ):
|
||||
'''
|
||||
Container class for name association KLL expressions
|
||||
'''
|
||||
type = None
|
||||
name = None
|
||||
association = None
|
||||
|
||||
## Setters ##
|
||||
def capability( self, name, association, parameters ):
|
||||
'''
|
||||
Assign a capability C function name association
|
||||
|
||||
@param name: Name of capability
|
||||
@param association: Name of capability in target backend output
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
self.type = 'Capability'
|
||||
self.name = name
|
||||
self.association = CapId( association, 'Definition', parameters )
|
||||
|
||||
return True
|
||||
|
||||
def define( self, name, association ):
|
||||
'''
|
||||
Assign a define C define name association
|
||||
|
||||
@param name: Name of variable
|
||||
@param association: Name of association in target backend output
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
self.type = 'Define'
|
||||
self.name = name
|
||||
self.association = association
|
||||
|
||||
return True
|
||||
|
||||
def __repr__( self ):
|
||||
return "{0} <= {1};".format( self.name, self.association )
|
||||
|
||||
def unique_keys( self ):
|
||||
'''
|
||||
Generates a list of unique identifiers for the expression that is mergeable
|
||||
with other functional equivalent expressions.
|
||||
'''
|
||||
return [ ( self.name, self ) ]
|
||||
|
||||
|
||||
class DataAssociationExpression( Expression ):
|
||||
'''
|
||||
Container class for data association KLL expressions
|
||||
'''
|
||||
type = None
|
||||
association = None
|
||||
value = None
|
||||
|
||||
## Setters ##
|
||||
def animation( self, animations, animation_modifiers ):
|
||||
'''
|
||||
Animation definition and configuration
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
self.type = 'Animation'
|
||||
self.association = animations
|
||||
self.value = animation_modifiers
|
||||
|
||||
return True
|
||||
|
||||
def animationFrame( self, animation_frames, pixel_modifiers ):
|
||||
'''
|
||||
Pixel composition of an Animation Frame
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
|
||||
self.type = 'AnimationFrame'
|
||||
self.association = animation_frames
|
||||
self.value = pixel_modifiers
|
||||
|
||||
return True
|
||||
|
||||
def pixelPosition( self, pixels, position ):
|
||||
'''
|
||||
Pixel Positioning
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
for pixel in pixels:
|
||||
pixel.setPosition( position )
|
||||
|
||||
self.type = 'PixelPosition'
|
||||
self.association = pixels
|
||||
|
||||
return True
|
||||
|
||||
def scanCodePosition( self, scancodes, position ):
|
||||
'''
|
||||
Scan Code to Position Mapping
|
||||
|
||||
Note: Accepts lists of scan codes
|
||||
Alone this isn't useful, but you can assign rows and columns using ranges instead of individually
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
for scancode in scancodes:
|
||||
scancode.setPosition( position )
|
||||
|
||||
self.type = 'ScanCodePosition'
|
||||
self.association = scancodes
|
||||
|
||||
return True
|
||||
|
||||
def __repr__( self ):
|
||||
if self.type in ['PixelPosition', 'ScanCodePosition']:
|
||||
output = ""
|
||||
for index, association in enumerate( self.association ):
|
||||
if index > 0:
|
||||
output += "; "
|
||||
output += "{0}".format( association )
|
||||
return "{0};".format( output )
|
||||
return "{0} <= {1};".format( self.association, self.value )
|
||||
|
||||
def unique_keys( self ):
|
||||
'''
|
||||
Generates a list of unique identifiers for the expression that is mergeable
|
||||
with other functional equivalent expressions.
|
||||
'''
|
||||
keys = []
|
||||
|
||||
# Positions require a bit more introspection to get the unique keys
|
||||
if self.type in ['PixelPosition', 'ScanCodePosition']:
|
||||
for index, key in enumerate( self.association ):
|
||||
uniq_expr = self
|
||||
|
||||
# If there is more than one key, copy the expression
|
||||
# and remove the non-related variants
|
||||
if len( self.association ) > 1:
|
||||
uniq_expr = copy.copy( self )
|
||||
|
||||
# Isolate variant by index
|
||||
uniq_expr.association = [ uniq_expr.association[ index ] ]
|
||||
|
||||
keys.append( ( "{0}".format( key.unique_key() ), uniq_expr ) )
|
||||
|
||||
# AnimationFrames are already list of keys
|
||||
# TODO Reorder frame assignments to dedup function equivalent mappings
|
||||
elif self.type in ['AnimationFrame']:
|
||||
for index, key in enumerate( self.association ):
|
||||
uniq_expr = self
|
||||
|
||||
# If there is more than one key, copy the expression
|
||||
# and remove the non-related variants
|
||||
if len( self.association ) > 1:
|
||||
uniq_expr = copy.copy( self )
|
||||
|
||||
# Isolate variant by index
|
||||
uniq_expr.association = [ uniq_expr.association[ index ] ]
|
||||
|
||||
keys.append( ( "{0}".format( key ), uniq_expr ) )
|
||||
|
||||
# Otherwise treat as a single element
|
||||
else:
|
||||
keys = [ ( "{0}".format( self.association ), self ) ]
|
||||
|
||||
# Remove any duplicate keys
|
||||
# TODO Stat? Might be at neat report about how many duplicates were squashed
|
||||
keys = list( set( keys ) )
|
||||
|
||||
return keys
|
||||
|
||||
|
||||
class MapExpression( Expression ):
|
||||
'''
|
||||
Container class for KLL map expressions
|
||||
'''
|
||||
type = None
|
||||
triggers = None
|
||||
operator = None
|
||||
results = None
|
||||
animation = None
|
||||
animation_frame = None
|
||||
pixels = None
|
||||
position = None
|
||||
|
||||
## Setters ##
|
||||
def scanCode( self, triggers, operator, results ):
|
||||
'''
|
||||
Scan Code mapping
|
||||
|
||||
@param triggers: Sequence of combos of ranges of namedtuples
|
||||
@param operator: Type of map operation
|
||||
@param results: Sequence of combos of ranges of namedtuples
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
self.type = 'ScanCode'
|
||||
self.triggers = triggers
|
||||
self.operator = operator
|
||||
self.results = results
|
||||
|
||||
return True
|
||||
|
||||
def usbCode( self, triggers, operator, results ):
|
||||
'''
|
||||
USB Code mapping
|
||||
|
||||
@param triggers: Sequence of combos of ranges of namedtuples
|
||||
@param operator: Type of map operation
|
||||
@param results: Sequence of combos of ranges of namedtuples
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
self.type = 'USBCode'
|
||||
self.triggers = triggers
|
||||
self.operator = operator
|
||||
self.results = results
|
||||
|
||||
return True
|
||||
|
||||
def animationTrigger( self, animation, operator, results ):
|
||||
'''
|
||||
Animation Trigger mapping
|
||||
|
||||
@param animation: Animation trigger of result
|
||||
@param operator: Type of map operation
|
||||
@param results: Sequence of combos of ranges of namedtuples
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
self.type = 'Animation'
|
||||
self.animation = animation
|
||||
self.triggers = animation
|
||||
self.operator = operator
|
||||
self.results = results
|
||||
|
||||
return True
|
||||
|
||||
def pixelChannels( self, pixelmap, trigger ):
|
||||
'''
|
||||
Pixel Channel Composition
|
||||
|
||||
@return: True if parsing was successful
|
||||
'''
|
||||
self.type = 'PixelChannel'
|
||||
self.pixel = pixelmap
|
||||
self.position = trigger
|
||||
|
||||
return True
|
||||
|
||||
def sequencesOfCombosOfIds( self, expression_param ):
|
||||
'''
|
||||
Prettified Sequence of Combos of Identifiers
|
||||
|
||||
@param expression_param: Trigger or Result parameter of an expression
|
||||
|
||||
Scan Code Example
|
||||
[[[S10, S16], [S42]], [[S11, S16], [S42]]] -> (S10 + S16, S42)|(S11 + S16, S42)
|
||||
'''
|
||||
output = ""
|
||||
|
||||
# Sometimes during error cases, might be None
|
||||
if expression_param is None:
|
||||
return output
|
||||
|
||||
# Iterate over each trigger/result variants (expanded from ranges), each one is a sequence
|
||||
for index, sequence in enumerate( expression_param ):
|
||||
if index > 0:
|
||||
output += "|"
|
||||
output += "("
|
||||
|
||||
# Iterate over each combo (element of the sequence)
|
||||
for index, combo in enumerate( sequence ):
|
||||
if index > 0:
|
||||
output += ", "
|
||||
|
||||
# Iterate over each trigger identifier
|
||||
for index, identifier in enumerate( combo ):
|
||||
if index > 0:
|
||||
output += " + "
|
||||
output += "{0}".format( identifier )
|
||||
|
||||
output += ")"
|
||||
|
||||
return output
|
||||
|
||||
def elems( self ):
|
||||
'''
|
||||
Return number of trigger and result elements
|
||||
|
||||
Useful for determining if this is a trigger macro (2+)
|
||||
Should always return at least (1,1) unless it's an invalid calculation
|
||||
|
||||
@return: ( triggers, results )
|
||||
'''
|
||||
elems = [ 0, 0 ]
|
||||
|
||||
# XXX Needed?
|
||||
if self.type == 'PixelChannel':
|
||||
return tuple( elems )
|
||||
|
||||
# Iterate over each trigger variant (expanded from ranges), each one is a sequence
|
||||
for sequence in self.triggers:
|
||||
# Iterate over each combo (element of the sequence)
|
||||
for combo in sequence:
|
||||
# Just measure the size of the combo
|
||||
elems[0] += len( combo )
|
||||
|
||||
# Iterate over each result variant (expanded from ranges), each one is a sequence
|
||||
for sequence in self.results:
|
||||
# Iterate over each combo (element of the sequence)
|
||||
for combo in sequence:
|
||||
# Just measure the size of the combo
|
||||
elems[1] += len( combo )
|
||||
|
||||
return tuple( elems )
|
||||
|
||||
def trigger_str( self ):
|
||||
'''
|
||||
String version of the trigger
|
||||
Used for sorting
|
||||
'''
|
||||
# Pixel Channel Mapping doesn't follow the same pattern
|
||||
if self.type == 'PixelChannel':
|
||||
return "{0}".format( self.pixel )
|
||||
|
||||
return "{0}".format(
|
||||
self.sequencesOfCombosOfIds( self.triggers ),
|
||||
)
|
||||
|
||||
def result_str( self ):
|
||||
'''
|
||||
String version of the result
|
||||
Used for sorting
|
||||
'''
|
||||
# Pixel Channel Mapping doesn't follow the same pattern
|
||||
if self.type == 'PixelChannel':
|
||||
return "{0}".format( self.position )
|
||||
|
||||
return "{0}".format(
|
||||
self.sequencesOfCombosOfIds( self.results ),
|
||||
)
|
||||
|
||||
def __repr__( self ):
|
||||
# Pixel Channel Mapping doesn't follow the same pattern
|
||||
if self.type == 'PixelChannel':
|
||||
return "{0} : {1};".format( self.pixel, self.position )
|
||||
|
||||
return "{0} {1} {2};".format(
|
||||
self.sequencesOfCombosOfIds( self.triggers ),
|
||||
self.operator,
|
||||
self.sequencesOfCombosOfIds( self.results ),
|
||||
)
|
||||
|
||||
def unique_keys( self ):
|
||||
'''
|
||||
Generates a list of unique identifiers for the expression that is mergeable
|
||||
with other functional equivalent expressions.
|
||||
|
||||
TODO: This function should re-order combinations to generate the key.
|
||||
The final generated combo will be in the original order.
|
||||
'''
|
||||
keys = []
|
||||
|
||||
# Pixel Channel only has key per mapping
|
||||
if self.type == 'PixelChannel':
|
||||
keys = [ ( "{0}".format( self.pixel ), self ) ]
|
||||
|
||||
# Split up each of the keys
|
||||
else:
|
||||
# Iterate over each trigger/result variants (expanded from ranges), each one is a sequence
|
||||
for index, sequence in enumerate( self.triggers ):
|
||||
key = ""
|
||||
uniq_expr = self
|
||||
|
||||
# If there is more than one key, copy the expression
|
||||
# and remove the non-related variants
|
||||
if len( self.triggers ) > 1:
|
||||
uniq_expr = copy.copy( self )
|
||||
|
||||
# Isolate variant by index
|
||||
uniq_expr.triggers = [ uniq_expr.triggers[ index ] ]
|
||||
|
||||
# Iterate over each combo (element of the sequence)
|
||||
for index, combo in enumerate( sequence ):
|
||||
if index > 0:
|
||||
key += ", "
|
||||
|
||||
# Iterate over each trigger identifier
|
||||
for index, identifier in enumerate( combo ):
|
||||
if index > 0:
|
||||
key += " + "
|
||||
key += "{0}".format( identifier )
|
||||
|
||||
# Add key to list
|
||||
keys.append( ( key, uniq_expr ) )
|
||||
|
||||
# Remove any duplicate keys
|
||||
# TODO Stat? Might be at neat report about how many duplicates were squashed
|
||||
keys = list( set( keys ) )
|
||||
|
||||
return keys
|
||||
|
94
common/file.py
Normal file
94
common/file.py
Normal file
@ -0,0 +1,94 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL File Container
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
import os
|
||||
|
||||
import common.context as context
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class KLLFile:
|
||||
'''
|
||||
Container class for imported KLL files
|
||||
'''
|
||||
def __init__( self, path, file_context ):
|
||||
'''
|
||||
Initialize file container
|
||||
|
||||
@param path: Path to filename, if relative, relative to the execution environment
|
||||
@param context: KLL Context object
|
||||
'''
|
||||
self.path = path
|
||||
self.context = file_context
|
||||
self.lines = []
|
||||
self.data = ""
|
||||
|
||||
def __repr__( self ):
|
||||
context_str = type( self.context ).__name__
|
||||
|
||||
# Show layer info if this is a PartialMap
|
||||
if isinstance( self.context, context.PartialMapContext ):
|
||||
context_str = "{0}({1})".format( context_str, self.context.layer )
|
||||
|
||||
return "({0}, {1})".format( self.path, context_str )
|
||||
|
||||
def check( self ):
|
||||
'''
|
||||
Make sure that the file exists at the initialized path
|
||||
'''
|
||||
exists = os.path.isfile( self.path )
|
||||
|
||||
# Display error message, will exit later
|
||||
if not exists:
|
||||
print( "{0} {1} does not exist...".format( ERROR, self.path ) )
|
||||
|
||||
return exists
|
||||
|
||||
def read( self ):
|
||||
'''
|
||||
Read the contents of the file path into memory
|
||||
Reads both per line and complete copies
|
||||
'''
|
||||
try:
|
||||
# Read file into memory, removing newlines
|
||||
with open( self.path ) as f:
|
||||
self.data = f.read()
|
||||
self.lines = self.data.splitlines()
|
||||
|
||||
except:
|
||||
print( "{0} Failed to read '{1}' into memory...".format( ERROR, self.path ) )
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
|
||||
|
1583
common/hid_dict.py
Normal file
1583
common/hid_dict.py
Normal file
File diff suppressed because it is too large
Load Diff
280
common/id.py
Normal file
280
common/id.py
Normal file
@ -0,0 +1,280 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Id Containers
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
from common.hid_dict import hid_lookup_dictionary
|
||||
|
||||
from common.channel import ChannelList
|
||||
from common.modifier import AnimationModifierList, PixelModifierList
|
||||
from common.position import Position
|
||||
from common.schedule import Schedule
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Id:
|
||||
'''
|
||||
Base container class for various KLL types
|
||||
'''
|
||||
def __init__( self ):
|
||||
self.type = None
|
||||
self.uid = None
|
||||
|
||||
|
||||
class HIDId( Id, Schedule ):
|
||||
'''
|
||||
HID/USB identifier container class
|
||||
'''
|
||||
secondary_types = {
|
||||
'USBCode' : 'USB',
|
||||
'SysCode' : 'SYS',
|
||||
'ConsCode' : 'CONS',
|
||||
'IndCode' : 'IND',
|
||||
}
|
||||
|
||||
def __init__( self, type, uid ):
|
||||
'''
|
||||
@param type: String type of the Id
|
||||
@param uid: Unique integer identifier for the Id
|
||||
'''
|
||||
Id.__init__( self )
|
||||
Schedule.__init__( self )
|
||||
self.type = type
|
||||
self.uid = uid
|
||||
|
||||
# Set secondary type
|
||||
self.second_type = self.secondary_types[ self.type ]
|
||||
|
||||
# TODO Validate uid to make sure it's in the lookup dictionary
|
||||
# TODO Validate HID specifier
|
||||
#print ( "{0} Unknown HID Specifier '{1}'".format( ERROR, type ) )
|
||||
#raise
|
||||
|
||||
def __repr__( self ):
|
||||
'''
|
||||
Use string name instead of integer, easier to debug
|
||||
'''
|
||||
uid = hid_lookup_dictionary[ ( self.second_type, self.uid ) ]
|
||||
schedule = self.strSchedule()
|
||||
if len( schedule ) > 0:
|
||||
schedule = "({0})".format( schedule )
|
||||
|
||||
output = 'HID({0})"{1}"{2}'.format( self.type, uid, schedule )
|
||||
return output
|
||||
|
||||
|
||||
class ScanCodeId( Id, Schedule, Position ):
|
||||
'''
|
||||
Scan Code identifier container class
|
||||
'''
|
||||
|
||||
def __init__( self, uid ):
|
||||
Id.__init__( self )
|
||||
Schedule.__init__( self )
|
||||
Position.__init__( self )
|
||||
self.type = 'ScanCode'
|
||||
self.uid = uid
|
||||
|
||||
# By default, interconnect_id of 0
|
||||
# Will be set during the merge process if it needs to change
|
||||
self.interconnect_id = 0
|
||||
|
||||
def unique_key( self ):
|
||||
'''
|
||||
Returns the key string used for datastructure sorting
|
||||
'''
|
||||
# Positions are a special case
|
||||
if self.positionSet():
|
||||
return "P{0}".format( self.uid )
|
||||
|
||||
def __repr__( self ):
|
||||
# Positions are a special case
|
||||
if self.positionSet():
|
||||
return "{0} <= {1}".format( self.unique_key(), self.strPosition() )
|
||||
|
||||
schedule = self.strSchedule()
|
||||
if len( schedule ) > 0:
|
||||
return "S{0}({1})".format( self.uid, schedule )
|
||||
else:
|
||||
return "S{0}".format( self.uid )
|
||||
|
||||
|
||||
class AnimationId( Id, AnimationModifierList ):
|
||||
'''
|
||||
Animation identifier container class
|
||||
'''
|
||||
name = None
|
||||
|
||||
def __init__( self, name ):
|
||||
Id.__init__( self )
|
||||
AnimationModifierList.__init__( self )
|
||||
self.name = name
|
||||
self.type = 'Animation'
|
||||
|
||||
def __repr__( self ):
|
||||
if len( self.modifiers ) > 0:
|
||||
return "A[{0}]({1})".format( self.name, self.strModifiers() )
|
||||
return "A[{0}]".format( self.name )
|
||||
|
||||
|
||||
class AnimationFrameId( Id, AnimationModifierList ):
|
||||
'''
|
||||
Animation Frame identifier container class
|
||||
'''
|
||||
def __init__( self, name, index ):
|
||||
Id.__init__( self )
|
||||
AnimationModifierList.__init__( self )
|
||||
self.name = name
|
||||
self.index = index
|
||||
self.type = 'AnimationFrame'
|
||||
|
||||
def __repr__( self ):
|
||||
return "AF[{0}, {1}]".format( self.name, self.index )
|
||||
|
||||
|
||||
class PixelId( Id, Position, PixelModifierList, ChannelList ):
|
||||
'''
|
||||
Pixel identifier container class
|
||||
'''
|
||||
def __init__( self, uid ):
|
||||
Id.__init__( self )
|
||||
Position.__init__( self )
|
||||
PixelModifierList.__init__( self )
|
||||
ChannelList.__init__( self )
|
||||
self.uid = uid
|
||||
self.type = 'Pixel'
|
||||
|
||||
def unique_key( self ):
|
||||
'''
|
||||
Returns the key string used for datastructure sorting
|
||||
'''
|
||||
return "P{0}".format( self.uid )
|
||||
|
||||
def __repr__( self ):
|
||||
# Positions are a special case
|
||||
if self.positionSet():
|
||||
return "{0} <= {1}".format( self.unique_key(), self.strPosition() )
|
||||
|
||||
extra = ""
|
||||
if len( self.modifiers ) > 0:
|
||||
extra += "({0})".format( self.strModifiers() )
|
||||
if len( self.channels ) > 0:
|
||||
extra += "({0})".format( self.strChannels() )
|
||||
return "{0}{1}".format( self.unique_key(), extra )
|
||||
|
||||
|
||||
class PixelLayerId( Id, PixelModifierList ):
|
||||
'''
|
||||
Pixel Layer identifier container class
|
||||
'''
|
||||
def __init__( self, uid ):
|
||||
Id.__init__( self )
|
||||
PixelModifierList.__init__( self )
|
||||
self.uid = uid
|
||||
self.type = 'PixelLayer'
|
||||
|
||||
def __repr__( self ):
|
||||
if len( self.modifiers ) > 0:
|
||||
return "PL{0}({1})".format( self.uid, self.strModifiers() )
|
||||
return "PL{0}".format( self.uid )
|
||||
|
||||
|
||||
class CapId( Id ):
|
||||
'''
|
||||
Capability identifier
|
||||
'''
|
||||
def __init__( self, name, type, arg_list=[] ):
|
||||
'''
|
||||
@param name: Name of capability
|
||||
@param type: Type of capability definition, string
|
||||
@param arg_list: List of CapArgIds, empty list if there are none
|
||||
'''
|
||||
Id.__init__( self )
|
||||
self.name = name
|
||||
self.type = type
|
||||
self.arg_list = arg_list
|
||||
|
||||
def __repr__( self ):
|
||||
# Generate prettified argument list
|
||||
arg_string = ""
|
||||
for arg in self.arg_list:
|
||||
arg_string += "{0},".format( arg )
|
||||
if len( arg_string ) > 0:
|
||||
arg_string = arg_string[:-1]
|
||||
|
||||
return "{0}({1})".format( self.name, arg_string )
|
||||
|
||||
def total_arg_bytes( self ):
|
||||
'''
|
||||
Calculate the total number of bytes needed for the args
|
||||
|
||||
return: Number of bytes
|
||||
'''
|
||||
# Zero if no args
|
||||
total_bytes = 0
|
||||
for arg in self.arg_list:
|
||||
total_bytes += arg.width
|
||||
|
||||
return total_bytes
|
||||
|
||||
|
||||
class NoneId( CapId ):
|
||||
'''
|
||||
None identifier
|
||||
|
||||
It's just a capability...that does nothing (instead of infering to do something else)
|
||||
'''
|
||||
def __init__( self ):
|
||||
super().__init__( 'None', 'None' )
|
||||
|
||||
def __repr__( self ):
|
||||
return "None"
|
||||
|
||||
|
||||
class CapArgId( Id ):
|
||||
'''
|
||||
Capability Argument identifier
|
||||
'''
|
||||
def __init__( self, name, width=None ):
|
||||
'''
|
||||
@param name: Name of argument
|
||||
@param width: Byte-width of the argument, if None, this is not port of a capability definition
|
||||
'''
|
||||
Id.__init__( self )
|
||||
self.name = name
|
||||
self.width = width
|
||||
self.type = 'CapArg'
|
||||
|
||||
def __repr__( self ):
|
||||
if self.width is None:
|
||||
return "{0}".format( self.name )
|
||||
else:
|
||||
return "{0}:{1}".format( self.name, self.width )
|
||||
|
126
common/modifier.py
Normal file
126
common/modifier.py
Normal file
@ -0,0 +1,126 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Modifier Containers
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class AnimationModifier:
|
||||
'''
|
||||
Animation modification container class
|
||||
'''
|
||||
def __init__( self, name, value=None ):
|
||||
self.name = name
|
||||
self.value = value
|
||||
|
||||
def __repr__( self ):
|
||||
if self.value is None:
|
||||
return "{0}".format( self.name )
|
||||
return "{0}:{1}".format( self.name, self.value )
|
||||
|
||||
|
||||
class AnimationModifierList:
|
||||
'''
|
||||
Animation modification container list class
|
||||
|
||||
Contains a list of modifiers, the order does not matter
|
||||
'''
|
||||
def __init__( self ):
|
||||
self.modifiers = []
|
||||
|
||||
def setModifiers( self, modifier_list ):
|
||||
'''
|
||||
Apply modifiers to Animation
|
||||
'''
|
||||
for modifier in modifier_list:
|
||||
self.modifiers.append( AnimationModifier( modifier[0], modifier[1] ) )
|
||||
|
||||
def strModifiers( self ):
|
||||
'''
|
||||
__repr__ of Position when multiple inheritance is used
|
||||
'''
|
||||
output = ""
|
||||
for index, modifier in enumerate( self.modifiers ):
|
||||
if index > 0:
|
||||
output += ","
|
||||
output += "{0}".format( modifier )
|
||||
|
||||
return output
|
||||
|
||||
def __repr__( self ):
|
||||
return self.strModifiers()
|
||||
|
||||
|
||||
class PixelModifier:
|
||||
'''
|
||||
Pixel modification container class
|
||||
'''
|
||||
def __init__( self, operator, value ):
|
||||
self.operator = operator
|
||||
self.value = value
|
||||
|
||||
def __repr__( self ):
|
||||
if self.operator is None:
|
||||
return "{0}".format( self.value )
|
||||
return "{0}{1}".format( self.operator, self.value )
|
||||
|
||||
|
||||
class PixelModifierList:
|
||||
'''
|
||||
Pixel modification container list class
|
||||
|
||||
Contains a list of modifiers
|
||||
Index 0, corresponds to pixel 0
|
||||
'''
|
||||
def __init__( self ):
|
||||
self.modifiers = []
|
||||
|
||||
def setModifiers( self, modifier_list ):
|
||||
'''
|
||||
Apply modifier to each pixel channel
|
||||
'''
|
||||
for modifier in modifier_list:
|
||||
self.modifiers.append( PixelModifier( modifier[0], modifier[1] ) )
|
||||
|
||||
def strModifiers( self ):
|
||||
'''
|
||||
__repr__ of Position when multiple inheritance is used
|
||||
'''
|
||||
output = ""
|
||||
for index, modifier in enumerate( self.modifiers ):
|
||||
if index > 0:
|
||||
output += ","
|
||||
output += "{0}".format( modifier )
|
||||
|
||||
return output
|
||||
|
||||
def __repr__( self ):
|
||||
return self.strModifiers()
|
||||
|
614
common/organization.py
Normal file
614
common/organization.py
Normal file
@ -0,0 +1,614 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Data Organization
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
import re
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
ansi_escape = re.compile(r'\x1b[^m]*m')
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Data:
|
||||
'''
|
||||
Base class for KLL datastructures
|
||||
'''
|
||||
# Debug output formatters
|
||||
debug_output = {
|
||||
'add' : "\t\033[1;42;37m++\033[0m\033[1mADD KEY\033[1;42;37m++\033[0m \033[1m<==\033[0m {0}",
|
||||
'app' : "\t\033[1;45;37m**\033[0m\033[1mAPP KEY\033[1;45;37m**\033[0m \033[1m<==\033[0m {0}",
|
||||
'mod' : "\t\033[1;44;37m##\033[0m\033[1mMOD KEY\033[1;44;37m##\033[0m \033[1m<==\033[0m {0}",
|
||||
'rem' : "\t\033[1;41;37m--\033[0m\033[1mREM KEY\033[1;41;37m--\033[0m \033[1m<==\033[0m {0}",
|
||||
'drp' : "\t\033[1;43;37m@@\033[0m\033[1mDRP KEY\033[1;43;37m@@\033[0m \033[1m<==\033[0m {0}",
|
||||
'dup' : "\t\033[1;46;37m!!\033[0m\033[1mDUP KEY\033[1;46;37m!!\033[0m \033[1m<==\033[0m {0}",
|
||||
}
|
||||
|
||||
def __init__( self, parent ):
|
||||
'''
|
||||
Initialize datastructure
|
||||
|
||||
@param parent: Parent organization, used to query data from other datastructures
|
||||
'''
|
||||
self.data = {}
|
||||
self.parent = parent
|
||||
|
||||
def add_expression( self, expression, debug ):
|
||||
'''
|
||||
Add expression to data structure
|
||||
|
||||
May have multiple keys to add for a given expression
|
||||
|
||||
@param expression: KLL Expression (fully tokenized and parsed)
|
||||
@param debug: Enable debug output
|
||||
'''
|
||||
# Lookup unique keys for expression
|
||||
keys = expression.unique_keys()
|
||||
|
||||
# Add/Modify expressions in datastructure
|
||||
for key, uniq_expr in keys:
|
||||
# Check which operation we are trying to do, add or modify
|
||||
if debug[0]:
|
||||
if key in self.data.keys():
|
||||
output = self.debug_output['mod'].format( key )
|
||||
else:
|
||||
output = self.debug_output['add'].format( key )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ) )
|
||||
|
||||
self.data[ key ] = uniq_expr
|
||||
|
||||
def merge( self, merge_in, debug ):
|
||||
'''
|
||||
Merge in the given datastructure to this datastructure
|
||||
|
||||
This datastructure serves as the base.
|
||||
|
||||
@param merge_in: Data structure from another organization to merge into this one
|
||||
@param debug: Enable debug out
|
||||
'''
|
||||
# The default case is just to add the expression in directly
|
||||
for key, kll_expression in merge_in.data.items():
|
||||
# Display key:expression being merged in
|
||||
if debug[0]:
|
||||
output = merge_in.elem_str( key, True )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ), end="" )
|
||||
|
||||
self.add_expression( kll_expression, debug )
|
||||
|
||||
def reduction( self ):
|
||||
'''
|
||||
Simplifies datastructure
|
||||
|
||||
Most of the datastructures don't have a reduction. Just do nothing in this case.
|
||||
'''
|
||||
pass
|
||||
|
||||
def elem_str( self, key, single=False ):
|
||||
'''
|
||||
Debug output for a single element
|
||||
|
||||
@param key: Index to datastructure
|
||||
@param single: Setting to True will bold the key
|
||||
'''
|
||||
if single:
|
||||
return "\033[1;33m{0: <20}\033[0m \033[1;36;41m>\033[0m {1}\n".format( key, self.data[ key ] )
|
||||
else:
|
||||
return "{0: <20} \033[1;36;41m>\033[0m {1}\n".format( key, self.data[ key ] )
|
||||
|
||||
def __repr__( self ):
|
||||
output = ""
|
||||
|
||||
# Display sorted list of keys, along with the internal value
|
||||
for key in sorted( self.data ):
|
||||
output += self.elem_str( key )
|
||||
|
||||
return output
|
||||
|
||||
|
||||
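A short sketch of how a Data store is driven, assuming the two-element debug flag is (enable output, keep colour codes), which is how add_expression() indexes it. FakeExpr is a hypothetical stand-in; real expressions come from common/expression.py and provide unique_keys().

from common.organization import Data

class FakeExpr:
    '''Hypothetical stand-in for a parsed KLL expression'''
    def __init__( self, key ):
        self.key = key
    def unique_keys( self ):
        return [ ( self.key, self ) ]
    def __repr__( self ):
        return "<expr {0}>".format( self.key )

store = Data( None )        # parent Organization is not needed for this sketch
debug = ( True, False )     # show debug output, strip colour codes

store.add_expression( FakeExpr( 'myVar' ), debug )   # prints an ADD KEY line
store.add_expression( FakeExpr( 'myVar' ), debug )   # same key again, prints MOD KEY
print( store )              # each key padded to 20 chars, then the expression repr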
class MappingData( Data ):
|
||||
'''
|
||||
KLL datastructure for data mapping
|
||||
|
||||
ScanCode trigger -> result
|
||||
USBCode trigger -> result
|
||||
Animation trigger -> result
|
||||
'''
|
||||
def add_expression( self, expression, debug ):
|
||||
'''
|
||||
Add expression to data structure
|
||||
|
||||
May have multiple keys to add for a given expression
|
||||
|
||||
Map expressions insert into the datastructure according to their operator.
|
||||
|
||||
+Operators+
|
||||
: Add/Modify
|
||||
:+ Append
|
||||
:- Remove
|
||||
:: Lazy Add/Modify
|
||||
|
||||
i: Add/Modify
|
||||
i:+ Append
|
||||
i:- Remove
|
||||
i:: Lazy Add/Modify
|
||||
|
||||
The i or isolation operators are stored separately from the main ones.
|
||||
Each key is pre-pended with an i
|
||||
|
||||
The :: or lazy operators act just like : operators, except that they will be ignored if the evaluation
|
||||
merge cannot resolve a ScanCode.
|
||||
|
||||
@param expression: KLL Expression (fully tokenized and parsed)
|
||||
@param debug: Enable debug output
|
||||
'''
|
||||
# Lookup unique keys for expression
|
||||
keys = expression.unique_keys()
|
||||
|
||||
# Add/Modify expressions in datastructure
|
||||
for key, uniq_expr in keys:
|
||||
# Determine the expression operator
|
||||
operator = expression.operator
|
||||
|
||||
# Except for the : operator, all others have delayed action
|
||||
# Meaning, they change behaviour depending on how Contexts are merged
|
||||
# This means we can't simplify yet
|
||||
# In addition, :+ and :- are stackable, which means each key has a list of expressions
|
||||
# We append the operator to differentiate between the different types of delayed operations
|
||||
key = "{0}{1}".format( operator, key )
|
||||
|
||||
# Determine if key exists already
|
||||
exists = key in self.data.keys()
|
||||
|
||||
# Add/Modify
|
||||
if operator in [':', '::', 'i:', 'i::']:
|
||||
debug_tag = exists and 'mod' or 'add'
|
||||
|
||||
# Append/Remove
|
||||
else:
|
||||
# Check to make sure we haven't already appended expression
|
||||
# Use the string representation to do the comparison (general purpose)
|
||||
if exists and "{0}".format( uniq_expr ) in [ "{0}".format( elem ) for elem in self.data[ key ] ]:
|
||||
debug_tag = 'dup'
|
||||
|
||||
# Append
|
||||
elif operator in [':+', 'i:+']:
|
||||
debug_tag = 'app'
|
||||
|
||||
# Remove
|
||||
else:
|
||||
debug_tag = 'rem'
|
||||
|
||||
# Debug output
|
||||
if debug[0]:
|
||||
output = self.debug_output[ debug_tag ].format( key )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ) )
|
||||
|
||||
# Don't append if a duplicate
|
||||
if debug_tag == 'dup':
|
||||
continue
|
||||
|
||||
# Append, rather than replace
|
||||
if operator in [':+', ':-', 'i:+', 'i:-']:
|
||||
if exists:
|
||||
self.data[ key ].append( uniq_expr )
|
||||
|
||||
# Create initial list
|
||||
else:
|
||||
self.data[ key ] = [ uniq_expr ]
|
||||
else:
|
||||
self.data[ key ] = [ uniq_expr ]
|
||||
|
||||
def set_interconnect_id( self, interconnect_id, triggers ):
|
||||
'''
|
||||
Traverses the sequence of combo of identifiers to set the interconnect_id
|
||||
'''
|
||||
for sequence in triggers:
|
||||
for combo in sequence:
|
||||
for identifier in combo:
|
||||
identifier.interconnect_id = interconnect_id
|
||||
|
||||
def merge( self, merge_in, debug ):
|
||||
'''
|
||||
Merge in the given datastructure to this datastructure
|
||||
|
||||
This datastructure serves as the base.
|
||||
|
||||
Map expressions merge differently than insertions.
|
||||
|
||||
+Operators+
|
||||
: Add/Modify - Replace
|
||||
:+ Append - Add
|
||||
:- Remove - Remove
|
||||
:: Lazy Add/Modify - Replace if found, otherwise drop
|
||||
|
||||
i: Add/Modify - Replace
|
||||
i:+ Append - Add
|
||||
i:- Remove - Remove
|
||||
i:: Lazy Add/Modify - Replace if found, otherwise drop
|
||||
|
||||
@param merge_in: Data structure from another organization to merge into this one
|
||||
@param debug: Enable debug out
|
||||
'''
|
||||
# Check what the current interconnectId is
|
||||
# If not set, we set to 0 (default)
|
||||
# We use this to calculate the scancode during the DataAnalysisStage
|
||||
interconnect_id = 0
|
||||
if 'interconnectId' in self.parent.variable_data.data.keys():
|
||||
interconnect_id = self.parent.variable_data.data['interconnectId']
|
||||
|
||||
# Sort different types of keys
|
||||
cur_keys = merge_in.data.keys()
|
||||
|
||||
# Lazy Set ::
|
||||
lazy_keys = [ key for key in cur_keys if key[0:2] == '::' or key[0:3] == 'i::' ]
|
||||
cur_keys = list( set( cur_keys ) - set( lazy_keys ) )
|
||||
|
||||
# Append :+
|
||||
append_keys = [ key for key in cur_keys if key[0:2] == ':+' or key[0:3] == 'i:+' ]
|
||||
cur_keys = list( set( cur_keys ) - set( append_keys ) )
|
||||
|
||||
# Remove :-
|
||||
remove_keys = [ key for key in cur_keys if key[0:2] == ':-' or key[0:3] == 'i:-' ]
|
||||
cur_keys = list( set( cur_keys ) - set( remove_keys ) )
|
||||
|
||||
# Set :
|
||||
# Everything left is just a set
|
||||
set_keys = cur_keys
|
||||
|
||||
|
||||
# First process the :: (or lazy) operators
|
||||
# We need to read into this datastructure and apply those first
|
||||
# Otherwise we may get undesired behaviour
|
||||
for key in lazy_keys:
|
||||
# Display key:expression being merged in
|
||||
if debug[0]:
|
||||
output = merge_in.elem_str( key, True )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ), end="" )
|
||||
|
||||
# Construct target key
|
||||
target_key = key[0] == 'i' and "i{0}".format( key[2:] ) or key[1:]
|
||||
|
||||
# If target key exists, replace
|
||||
if target_key in self.data.keys():
|
||||
debug_tag = 'mod'
|
||||
else:
|
||||
debug_tag = 'drp'
|
||||
|
||||
# Debug output
|
||||
if debug[0]:
|
||||
output = self.debug_output[ debug_tag ].format( key )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ) )
|
||||
|
||||
# Only replace
|
||||
if debug_tag == 'mod':
|
||||
self.data[ target_key ] = merge_in.data[ key ]
|
||||
|
||||
# Then apply : assignment operators
|
||||
for key in set_keys:
|
||||
# Display key:expression being merged in
|
||||
if debug[0]:
|
||||
output = merge_in.elem_str( key, True )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ), end="" )
|
||||
|
||||
# Construct target key
|
||||
target_key = key
|
||||
|
||||
# Indicate if add or modify
|
||||
if target_key in self.data.keys():
|
||||
debug_tag = 'mod'
|
||||
else:
|
||||
debug_tag = 'add'
|
||||
|
||||
# Debug output
|
||||
if debug[0]:
|
||||
output = self.debug_output[ debug_tag ].format( key )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ) )
|
||||
|
||||
# Set into new datastructure regardless
|
||||
self.data[ target_key ] = merge_in.data[ key ]
|
||||
|
||||
# Only the : is used to set ScanCodes
|
||||
# We need to set the interconnect_id just in case the base context has it set
|
||||
# and in turn influence the new context as well
|
||||
# This must be done during the merge
|
||||
for elem in self.data[ target_key ]:
|
||||
if elem.type == 'ScanCode':
|
||||
self.set_interconnect_id( interconnect_id, elem.triggers )
|
||||
|
||||
# Now apply append operations
|
||||
for key in append_keys:
|
||||
# Display key:expression being merged in
|
||||
if debug[0]:
|
||||
output = merge_in.elem_str( key, True )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ), end="" )
|
||||
|
||||
# Construct target key
|
||||
target_key = key[0] == 'i' and "i:{0}".format( key[3:] ) or ":{0}".format( key[2:] )
|
||||
|
||||
# Always appending
|
||||
debug_tag = 'app'
|
||||
|
||||
# Debug output
|
||||
if debug[0]:
|
||||
output = self.debug_output[ debug_tag ].format( key )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ) )
|
||||
|
||||
# Extend list if it exists
|
||||
if target_key in self.data.keys():
|
||||
self.data[ target_key ].extend( merge_in.data[ key ] )
|
||||
else:
|
||||
self.data[ target_key ] = merge_in.data[ key ]
|
||||
|
||||
# Finally apply removal operations to this datastructure
|
||||
# If the target removal doesn't exist, ignore silently (show debug message)
|
||||
for key in remove_keys:
|
||||
# Display key:expression being merged in
|
||||
if debug[0]:
|
||||
output = merge_in.elem_str( key, True )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ), end="" )
|
||||
|
||||
# Construct target key
|
||||
target_key = key[0] == 'i' and "i:{0}".format( key[3:] ) or ":{0}".format( key[2:] )
|
||||
|
||||
# Drop right away if target datastructure doesn't have target key
|
||||
if target_key not in self.data.keys():
|
||||
debug_tag = 'drp'
|
||||
|
||||
# Debug output
|
||||
if debug[0]:
|
||||
output = self.debug_output[ debug_tag ].format( key )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ) )
|
||||
|
||||
continue
|
||||
|
||||
# Compare expressions to be removed with the current set
|
||||
# Use strings to compare
|
||||
remove_expressions = [ "{0}".format( expr ) for expr in merge_in.data[ key ] ]
|
||||
current_expressions = [ ( "{0}".format( expr ), expr ) for expr in self.data[ target_key ] ]
|
||||
for string, expr in current_expressions:
|
||||
debug_tag = 'drp'
|
||||
|
||||
# Check if an expression matches
|
||||
if string in remove_expressions:
|
||||
debug_tag = 'rem'
|
||||
|
||||
# Debug output
|
||||
if debug[0]:
|
||||
output = self.debug_output[ debug_tag ].format( key )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ) )
|
||||
|
||||
# Remove if found
|
||||
if debug_tag == 'rem':
|
||||
self.data[ target_key ] = [ value for value in self.data[ target_key ] if value != expr ]
|
||||
|
||||
def reduction( self ):
|
||||
'''
|
||||
Simplifies datastructure
|
||||
|
||||
Used to replace all trigger HIDCode(USBCode)s with ScanCodes
|
||||
|
||||
NOTE: Make sure to create a new MergeContext before calling this as you lose data and prior context
|
||||
'''
|
||||
scan_code_lookup = {}
|
||||
|
||||
# Build dictionary of single ScanCodes first
|
||||
for key, expr in self.data.items():
|
||||
if expr[0].elems()[0] == 1 and expr[0].triggers[0][0][0].type == 'ScanCode':
|
||||
scan_code_lookup[ key ] = expr
|
||||
|
||||
# Using this dictionary, replace all the trigger USB codes
|
||||
new_data = copy.copy( scan_code_lookup )
|
||||
|
||||
# 1) Single USB Codes trigger results will replace the original ScanCode result
|
||||
# 2)
|
||||
|
||||
#TODO
|
||||
print("YAY")
|
||||
print( scan_code_lookup )
|
||||
|
||||
|
||||
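An illustrative sketch of the key bookkeeping described above: add_expression() prefixes each key with its operator so delayed operations stay distinguishable, and merge() strips that prefix back off to find the target entry. The keys below are hypothetical; real ones come from Expression.unique_keys().

# Same transformations MappingData.merge() applies to lazy and append/remove keys
def lazy_target( key ):
    # '::<k>' -> ':<k>',  'i::<k>' -> 'i:<k>'
    return key[0] == 'i' and "i{0}".format( key[2:] ) or key[1:]

def append_or_remove_target( key ):
    # ':+<k>' / ':-<k>' -> ':<k>',  'i:+<k>' / 'i:-<k>' -> 'i:<k>'
    return key[0] == 'i' and "i:{0}".format( key[3:] ) or ":{0}".format( key[2:] )

print( lazy_target( '::S0x10' ) )               # :S0x10   (replace only if already set)
print( lazy_target( 'i::S0x10' ) )              # i:S0x10
print( append_or_remove_target( ':+S0x11' ) )   # :S0x11   (extend the stored list)
print( append_or_remove_target( 'i:-S0x11' ) )  # i:S0x11  (remove matching expressions)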
class AnimationData( Data ):
|
||||
'''
|
||||
KLL datastructure for Animation configuration
|
||||
|
||||
Animation -> modifiers
|
||||
'''
|
||||
|
||||
|
||||
class AnimationFrameData( Data ):
|
||||
'''
|
||||
KLL datastructure for Animation Frame configuration
|
||||
|
||||
Animation -> Pixel Settings
|
||||
'''
|
||||
|
||||
|
||||
class CapabilityData( Data ):
|
||||
'''
|
||||
KLL datastructure for Capability mapping
|
||||
|
||||
Capability -> C Function/Identifier
|
||||
'''
|
||||
|
||||
|
||||
class DefineData( Data ):
|
||||
'''
|
||||
KLL datastructure for Define mapping
|
||||
|
||||
Variable -> C Define/Identifier
|
||||
'''
|
||||
|
||||
|
||||
class PixelChannelData( Data ):
|
||||
'''
|
||||
KLL datastructure for Pixel Channel mapping
|
||||
|
||||
Pixel -> Channels
|
||||
'''
|
||||
|
||||
|
||||
class PixelPositionData( Data ):
|
||||
'''
|
||||
KLL datastructure for Pixel Position mapping
|
||||
|
||||
Pixel -> Physical Location
|
||||
'''
|
||||
|
||||
|
||||
class ScanCodePositionData( Data ):
|
||||
'''
|
||||
KLL datastructure for ScanCode Position mapping
|
||||
|
||||
ScanCode -> Physical Location
|
||||
'''
|
||||
|
||||
|
||||
class VariableData( Data ):
|
||||
'''
|
||||
KLL datastructure for Variables and Arrays
|
||||
|
||||
Variable -> Data
|
||||
Array -> Data
|
||||
'''
|
||||
|
||||
|
||||
class Organization:
|
||||
'''
|
||||
Container class for KLL datastructures
|
||||
|
||||
The purpose of these datastructures is to store expressions symbolically at first, then progressively solve/deduplicate them.
|
||||
Since the order in which the merges occur matters, this involves a number of intermediate steps.
|
||||
'''
|
||||
|
||||
def __init__( self ):
|
||||
'''
|
||||
Initialize data structure
|
||||
'''
|
||||
# Setup each of the internal sub-datastructures
|
||||
self.animation_data = AnimationData( self )
|
||||
self.animation_frame_data = AnimationFrameData( self )
|
||||
self.capability_data = CapabilityData( self )
|
||||
self.define_data = DefineData( self )
|
||||
self.mapping_data = MappingData( self )
|
||||
self.pixel_channel_data = PixelChannelData( self )
|
||||
self.pixel_position_data = PixelPositionData( self )
|
||||
self.scan_code_position_data = ScanCodePositionData( self )
|
||||
self.variable_data = VariableData( self )
|
||||
|
||||
# Expression to Datastructure mapping
|
||||
self.data_mapping = {
|
||||
'AssignmentExpression' : {
|
||||
'Array' : self.variable_data,
|
||||
'Variable' : self.variable_data,
|
||||
},
|
||||
'DataAssociationExpression' : {
|
||||
'Animation' : self.animation_data,
|
||||
'AnimationFrame' : self.animation_frame_data,
|
||||
'PixelPosition' : self.pixel_position_data,
|
||||
'ScanCodePosition' : self.scan_code_position_data,
|
||||
},
|
||||
'MapExpression' : {
|
||||
'ScanCode' : self.mapping_data,
|
||||
'USBCode' : self.mapping_data,
|
||||
'Animation' : self.mapping_data,
|
||||
'PixelChannel' : self.pixel_channel_data,
|
||||
},
|
||||
'NameAssociationExpression' : {
|
||||
'Capability' : self.capability_data,
|
||||
'Define' : self.define_data,
|
||||
},
|
||||
}
|
||||
|
||||
def stores( self ):
|
||||
'''
|
||||
Returns list of sub-datastructures
|
||||
'''
|
||||
return [
|
||||
self.animation_data,
|
||||
self.animation_frame_data,
|
||||
self.capability_data,
|
||||
self.define_data,
|
||||
self.mapping_data,
|
||||
self.pixel_channel_data,
|
||||
self.pixel_position_data,
|
||||
self.scan_code_position_data,
|
||||
self.variable_data,
|
||||
]
|
||||
|
||||
def add_expression( self, expression, debug ):
|
||||
'''
|
||||
Add expression to datastructure
|
||||
|
||||
Will automatically determine the type of expression and place it in the relevant store
|
||||
|
||||
@param expression: KLL Expression (fully tokenized and parsed)
|
||||
@param debug: Enable debug output
|
||||
'''
|
||||
# Determine type of Expression
|
||||
expression_type = expression.__class__.__name__
|
||||
|
||||
# Determine Expression Subtype
|
||||
expression_subtype = expression.type
|
||||
|
||||
# Locate datastructure
|
||||
data = self.data_mapping[ expression_type ][ expression_subtype ]
|
||||
|
||||
# Debug output
|
||||
if debug[0]:
|
||||
output = "\t\033[4m{0}\033[0m".format( data.__class__.__name__ )
|
||||
print( debug[1] and output or ansi_escape.sub( '', output ) )
|
||||
|
||||
# Add expression to determined datastructure
|
||||
data.add_expression( expression, debug )
|
||||
|
||||
def merge( self, merge_in, debug ):
|
||||
'''
|
||||
Merge in the given organization to this organization
|
||||
|
||||
This organization serves as the base.
|
||||
|
||||
@param merge_in: Organization to merge into this one
|
||||
@param debug: Enable debug out
|
||||
'''
|
||||
# Merge each of the sub-datastructures
|
||||
for this, that in zip( self.stores(), merge_in.stores() ):
|
||||
this.merge( that, debug )
|
||||
|
||||
def reduction( self ):
|
||||
'''
|
||||
Simplifies datastructure
|
||||
|
||||
NOTE: This will remove data, therefore, context is lost
|
||||
'''
|
||||
for store in self.stores():
|
||||
store.reduction()
|
||||
|
||||
def __repr__( self ):
|
||||
return "{0}".format( self.stores() )
|
||||
|
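A minimal sketch of how an Organization routes and merges expressions. Since Organization.add_expression() dispatches on __class__.__name__ and .type, the stand-in class below reuses the name AssignmentExpression; the real class lives in common/expression.py and its constructor is not shown here, so the shape used below is an assumption, not its actual API.

from common.organization import Organization

class AssignmentExpression:
    '''Hypothetical stand-in; only what Organization/Data touch is sketched'''
    type = 'Variable'
    def __init__( self, name, value ):
        self.name = name
        self.value = value
    def unique_keys( self ):
        return [ ( self.name, self ) ]
    def __repr__( self ):
        return "{0} = {1}".format( self.name, self.value )

debug = ( False, False )   # quiet

base = Organization()
base.add_expression( AssignmentExpression( 'stateWordSize', '32' ), debug )

overlay = Organization()
overlay.add_expression( AssignmentExpression( 'stateWordSize', '16' ), debug )

# Later contexts win for plain assignments
base.merge( overlay, debug )
print( base.variable_data.data['stateWordSize'].value )   # 16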
common/parse.py (new file)
@@ -0,0 +1,830 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Parsing Expressions
|
||||
|
||||
This file contains various parsing rules and processors used by funcparserlib for KLL
|
||||
|
||||
REMEMBER: When editing parser BNF-like expressions, order matters. Specifically lexer tokens and parser rules.
|
||||
'''
|
||||
|
||||
# Parser doesn't play nice with linters, disable some checks
|
||||
# pylint: disable=no-self-argument, too-many-public-methods, no-self-use, bad-builtin
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
from common.hid_dict import kll_hid_lookup_dictionary
|
||||
|
||||
from common.id import (
|
||||
AnimationId, AnimationFrameId,
|
||||
CapArgId, CapId,
|
||||
HIDId,
|
||||
NoneId,
|
||||
PixelId, PixelLayerId,
|
||||
ScanCodeId
|
||||
)
|
||||
from common.modifier import AnimationModifierList
|
||||
from common.schedule import AnalogScheduleParam, ScheduleParam, Time
|
||||
|
||||
from funcparserlib.lexer import Token
|
||||
from funcparserlib.parser import (some, a, many, oneplus, skip, maybe)
|
||||
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
## Parsing Functions
|
||||
|
||||
class Make:
|
||||
'''
|
||||
Collection of parse string interpreters
|
||||
'''
|
||||
|
||||
def scanCode( token ):
|
||||
'''
|
||||
Converts a raw scan code string into a ScanCodeId /w integer
|
||||
|
||||
S0x10 -> 16
|
||||
'''
|
||||
if isinstance( token, int ):
|
||||
return ScanCodeId( token )
|
||||
else:
|
||||
return ScanCodeId( int( token[1:], 0 ) )
|
||||
|
||||
def hidCode( type, token ):
|
||||
'''
|
||||
Convert a given raw hid token string to an integer /w a type
|
||||
|
||||
U"Enter" -> USB, Enter(0x28)
|
||||
'''
|
||||
# If already converted to a HIDId, just return
|
||||
if isinstance( token, HIDId ):
|
||||
return token
|
||||
|
||||
# If first character is a U or I, strip
|
||||
if token[0] == "U" or token[0] == "I":
|
||||
token = token[1:]
|
||||
# CONS specifier
|
||||
elif 'CONS' in token:
|
||||
token = token[4:]
|
||||
# SYS specifier
|
||||
elif 'SYS' in token:
|
||||
token = token[3:]
|
||||
|
||||
# If using string representation of USB Code, do lookup, case-insensitive
|
||||
if '"' in token:
|
||||
try:
|
||||
hidCode = kll_hid_lookup_dictionary[ type ][ token[1:-1].upper() ][1]
|
||||
except LookupError as err:
|
||||
print ( "{0} {1} is an invalid USB HID Code Lookup...".format( ERROR, err ) )
|
||||
raise
|
||||
else:
|
||||
# Already tokenized
|
||||
if (
|
||||
type == 'USBCode' and token[0] == 'USB'
|
||||
or
|
||||
type == 'SysCode' and token[0] == 'SYS'
|
||||
or
|
||||
type == 'ConsCode' and token[0] == 'CONS'
|
||||
or
|
||||
type == 'IndCode' and token[0] == 'IND'
|
||||
):
|
||||
hidCode = token[1]
|
||||
# Convert
|
||||
else:
|
||||
hidCode = int( token, 0 )
|
||||
|
||||
return HIDId( type, hidCode )
|
||||
|
||||
|
||||
def usbCode( token ):
|
||||
'''
|
||||
Convert a given raw USB Keyboard hid token string to an integer /w a type
|
||||
|
||||
U"Enter" -> USB, Enter(0x28)
|
||||
'''
|
||||
return Make.hidCode( 'USBCode', token )
|
||||
|
||||
def consCode( token ):
|
||||
'''
|
||||
Convert a given raw Consumer Control hid token string to an integer /w a type
|
||||
'''
|
||||
return Make.hidCode( 'ConsCode', token )
|
||||
|
||||
def sysCode( token ):
|
||||
'''
|
||||
Convert a given raw System Control hid token string to an integer /w a type
|
||||
'''
|
||||
return Make.hidCode( 'SysCode', token )
|
||||
|
||||
def indCode( token ):
|
||||
'''
|
||||
Convert a given raw Indicator hid token string to an integer /w a type
|
||||
'''
|
||||
return Make.hidCode( 'IndCode', token )
|
||||
|
||||
def animation( name ):
|
||||
'''
|
||||
Converts a raw animation value into an AnimationId /w name
|
||||
|
||||
A"myname" -> myname
|
||||
'''
|
||||
if name[0] == "A":
|
||||
return AnimationId( name[2:-1] )
|
||||
else:
|
||||
return AnimationId( name )
|
||||
|
||||
def animationTrigger( animation, frame_indices ):
|
||||
'''
|
||||
Generate either an AnimationId or an AnimationFrameId
|
||||
|
||||
frame_indices indicate that this is an AnimationFrameId
|
||||
'''
|
||||
trigger_list = []
|
||||
# AnimationFrameId
|
||||
if len( frame_indices ) > 0:
|
||||
for index in frame_indices:
|
||||
trigger_list.append( [ [ AnimationFrameId( animation, index ) ] ] )
|
||||
# AnimationId
|
||||
else:
|
||||
trigger_list.append( [ [ AnimationId( animation ) ] ] )
|
||||
|
||||
return trigger_list
|
||||
|
||||
def animationCapability( animation, modifiers ):
|
||||
'''
|
||||
Apply modifiers to AnimationId
|
||||
'''
|
||||
if modifiers is not None:
|
||||
animation.setModifiers( modifiers )
|
||||
return [ animation ]
|
||||
|
||||
def animationModlist( modifiers ):
|
||||
'''
|
||||
Build an AnimationModifierList
|
||||
|
||||
Only used for animation data association
|
||||
'''
|
||||
modlist = AnimationModifierList()
|
||||
modlist.setModifiers( modifiers )
|
||||
return modlist
|
||||
|
||||
def pixelCapability( pixels, modifiers ):
|
||||
'''
|
||||
Apply modifiers to list of pixels/pixellists
|
||||
|
||||
Results in a combination of pixel capabilities
|
||||
'''
|
||||
pixelcap_list = []
|
||||
for pixel in pixels:
|
||||
pixel.setModifiers( modifiers )
|
||||
pixelcap_list.append( pixel )
|
||||
return pixelcap_list
|
||||
|
||||
def pixel( token ):
|
||||
'''
|
||||
Converts a raw pixel value into a PixelId /w integer
|
||||
|
||||
P0x3 -> 3
|
||||
'''
|
||||
if isinstance( token, int ):
|
||||
return PixelId( token )
|
||||
else:
|
||||
return PixelId( int( token[1:], 0 ) )
|
||||
|
||||
def pixel_list( pixel_list ):
|
||||
'''
|
||||
Converts a list of numbers into a list of PixelIds
|
||||
'''
|
||||
pixels = []
|
||||
for pixel in pixel_list:
|
||||
pixels.append( PixelId( pixel ) )
|
||||
return pixels
|
||||
|
||||
def pixelLayer( token ):
|
||||
'''
|
||||
Converts a raw pixel layer value into a PixelLayerId /w integer
|
||||
|
||||
PL0x3 -> 3
|
||||
'''
|
||||
if isinstance( token, int ):
|
||||
return PixelLayerId( token )
|
||||
else:
|
||||
return PixelLayerId( int( token[2:], 0 ) )
|
||||
|
||||
def pixelLayer_list( layer_list ):
|
||||
'''
|
||||
Converts a list of numbers into a list of PixelLayerIds
|
||||
'''
|
||||
layers = []
|
||||
for layer in layer_list:
|
||||
layers.append( PixelLayerId( layer ) )
|
||||
return layers
|
||||
|
||||
def pixelchan( pixel_list, chans ):
|
||||
'''
|
||||
Apply channels to PixelId
|
||||
|
||||
Only one pixel at a time can be mapped, hence pixel_list[0]
|
||||
'''
|
||||
pixel = pixel_list[0]
|
||||
pixel.setChannels( chans )
|
||||
return pixel
|
||||
|
||||
def pixelmod( pixels, modifiers ):
|
||||
'''
|
||||
Apply modifiers to list of pixels/pixellists
|
||||
|
||||
Results in a combination of pixel capabilities
|
||||
'''
|
||||
pixelcap_list = []
|
||||
for pixel in pixels:
|
||||
pixel.setModifiers( modifiers )
|
||||
pixelcap_list.append( pixel )
|
||||
return pixelcap_list
|
||||
|
||||
def position( token ):
|
||||
'''
|
||||
Physical position split
|
||||
|
||||
x:20 -> (x, 20)
|
||||
'''
|
||||
return token.split(':')
|
||||
|
||||
def usbCode_number( token ):
|
||||
'''
|
||||
USB Keyboard HID Code lookup
|
||||
'''
|
||||
return HIDId( 'USBCode', token )
|
||||
|
||||
def consCode_number( token ):
|
||||
'''
|
||||
Consumer Control HID Code lookup
|
||||
'''
|
||||
return HIDId( 'ConsCode', token )
|
||||
|
||||
def sysCode_number( token ):
|
||||
'''
|
||||
System Control HID Code lookup
|
||||
'''
|
||||
return HIDId( 'SysCode', token )
|
||||
|
||||
def indCode_number( token ):
|
||||
'''
|
||||
Indicator HID Code lookup
|
||||
'''
|
||||
return HIDId( 'IndCode', token )
|
||||
|
||||
def none( token ):
|
||||
'''
|
||||
Replace key-word with NoneId specifier (which indicates a noneOut capability)
|
||||
'''
|
||||
return [[[NoneId()]]]
|
||||
|
||||
def seqString( token ):
|
||||
'''
|
||||
Converts sequence string to a sequence of combinations
|
||||
|
||||
'Ab' -> U"Shift" + U"A", U"B"
|
||||
'''
|
||||
# TODO - Add locale support
|
||||
|
||||
# Shifted Characters, and amount to move by to get non-shifted version
|
||||
# US ANSI
|
||||
shiftCharacters = (
|
||||
( "ABCDEFGHIJKLMNOPQRSTUVWXYZ", 0x20 ),
|
||||
( "+", 0x12 ),
|
||||
( "&(", 0x11 ),
|
||||
( "!#$%", 0x10 ),
|
||||
( "*", 0x0E ),
|
||||
( ")", 0x07 ),
|
||||
( '"', 0x05 ),
|
||||
( ":", 0x01 ),
|
||||
( "@", -0x0E ),
|
||||
( "<>?", -0x10 ),
|
||||
( "~", -0x1E ),
|
||||
( "{}|", -0x20 ),
|
||||
( "^", -0x28 ),
|
||||
( "_", -0x32 ),
|
||||
)
|
||||
|
||||
listOfLists = []
|
||||
shiftKey = kll_hid_lookup_dictionary['USBCode']["SHIFT"]
|
||||
|
||||
# Creates a list of USB codes from the string: sequence (list) of combos (lists)
|
||||
for char in token[1:-1]:
|
||||
processedChar = char
|
||||
|
||||
# Whether or not to create a combo for this sequence with a shift
|
||||
shiftCombo = False
|
||||
|
||||
# Depending on the ASCII character, convert to single character or Shift + character
|
||||
for pair in shiftCharacters:
|
||||
if char in pair[0]:
|
||||
shiftCombo = True
|
||||
processedChar = chr( ord( char ) + pair[1] )
|
||||
break
|
||||
|
||||
# Do KLL HID Lookup on non-shifted character
|
||||
# NOTE: Case-insensitive, which is why the shift must be pre-computed
|
||||
usb_code = kll_hid_lookup_dictionary['USBCode'][ processedChar.upper() ]
|
||||
|
||||
# Create Combo for this character, add shift key if shifted
|
||||
charCombo = []
|
||||
if shiftCombo:
|
||||
charCombo = [ [ HIDId( 'USBCode', shiftKey[1] ) ] ]
|
||||
charCombo.append( [ HIDId( 'USBCode', usb_code[1] ) ] )
|
||||
|
||||
# Add to list of lists
|
||||
listOfLists.append( charCombo )
|
||||
|
||||
return listOfLists
|
||||
|
||||
def string( token ):
|
||||
'''
|
||||
Converts a raw string to a Python string
|
||||
|
||||
"this string" -> this string
|
||||
'''
|
||||
return token[1:-1]
|
||||
|
||||
def unseqString( token ):
|
||||
'''
|
||||
Converts a raw sequence string to a Python string
|
||||
|
||||
'this string' -> this string
|
||||
'''
|
||||
return token[1:-1]
|
||||
|
||||
def number( token ):
|
||||
'''
|
||||
Convert string number to Python integer
|
||||
'''
|
||||
return int( token, 0 )
|
||||
|
||||
def timing( token ):
|
||||
'''
|
||||
Convert raw timing parameter to integer time and determine units
|
||||
|
||||
1ms -> 1, ms
|
||||
'''
|
||||
# Find ms, us, or s
|
||||
if 'ms' in token:
|
||||
unit = 'ms'
|
||||
num = token.split('m')[0]
|
||||
elif 'us' in token:
|
||||
unit = 'us'
|
||||
num = token.split('u')[0]
|
||||
elif 'ns' in token:
|
||||
unit = 'ns'
|
||||
num = token.split('n')[0]
|
||||
elif 's' in token:
|
||||
unit = 's'
|
||||
num = token.split('s')[0]
|
||||
else:
|
||||
print ( "{0} cannot find timing unit in token '{1}'".format( ERROR, token ) )
|
||||
|
||||
return Time( float( num ), unit )
|
||||
|
||||
def specifierTiming( timing ):
|
||||
'''
|
||||
When only timing is given, infer state at a later stage from the context of the mapping
|
||||
'''
|
||||
return ScheduleParam( None, timing )
|
||||
|
||||
def specifierState( state, timing=None ):
|
||||
'''
|
||||
Generate a Schedule Parameter
|
||||
Automatically mutates itself into the correct object type
|
||||
'''
|
||||
return ScheduleParam( state, timing )
|
||||
|
||||
def specifierAnalog( value ):
|
||||
'''
|
||||
Generate an Analog Schedule Parameter
|
||||
'''
|
||||
return AnalogScheduleParam( value )
|
||||
|
||||
def specifierUnroll( identifier, schedule_params ):
|
||||
'''
|
||||
Unroll specifiers into the trigger/result identifier
|
||||
|
||||
First, combine all Schedule Parameters into a Schedule
|
||||
Then attach Schedule to the identifier
|
||||
|
||||
If the identifier is a list, then iterate through them
|
||||
and apply the schedule to each
|
||||
'''
|
||||
# Check if this is a list of identifiers
|
||||
if isinstance( identifier, list ):
|
||||
for ident in identifier:
|
||||
ident.setSchedule( schedule_params )
|
||||
return identifier
|
||||
else:
|
||||
identifier.setSchedule( schedule_params )
|
||||
|
||||
return [ identifier ]
|
||||
|
||||
|
||||
# Range can go from high to low or low to high
|
||||
def scanCode_range( rangeVals ):
|
||||
'''
|
||||
Scan Code range expansion
|
||||
|
||||
S[0x10-0x12] -> S0x10, S0x11, S0x12
|
||||
'''
|
||||
start = rangeVals[0]
|
||||
end = rangeVals[1]
|
||||
|
||||
# Swap start, end if start is greater than end
|
||||
if start > end:
|
||||
start, end = end, start
|
||||
|
||||
# Iterate from start to end, and generate the range
|
||||
values = list( range( start, end + 1 ) )
|
||||
|
||||
# Generate ScanCodeIds
|
||||
return [ ScanCodeId( v ) for v in values ]
|
||||
|
||||
# Range can go from high to low or low to high
|
||||
# Warn on 0-9 for USBCodes (as this does not do what one would expect) TODO
|
||||
# Lookup USB HID tags and convert to a number
|
||||
def hidCode_range( type, rangeVals ):
|
||||
'''
|
||||
HID Code range expansion
|
||||
|
||||
U["A"-"C"] -> U"A", U"B", U"C"
|
||||
'''
|
||||
|
||||
# Check if already integers
|
||||
if isinstance( rangeVals[0], int ):
|
||||
start = rangeVals[0]
|
||||
else:
|
||||
start = Make.hidCode( type, rangeVals[0] ).uid
|
||||
|
||||
if isinstance( rangeVals[1], int ):
|
||||
end = rangeVals[1]
|
||||
else:
|
||||
end = Make.hidCode( type, rangeVals[1] ).uid
|
||||
|
||||
# Swap start, end if start is greater than end
|
||||
if start > end:
|
||||
start, end = end, start
|
||||
|
||||
# Iterate from start to end, and generate the range
|
||||
listRange = list( range( start, end + 1 ) )
|
||||
|
||||
# Convert each item in the list to a tuple
|
||||
for item in range( len( listRange ) ):
|
||||
listRange[ item ] = HIDId( type, listRange[ item ] )
|
||||
return listRange
|
||||
|
||||
def usbCode_range( rangeVals ):
|
||||
'''
|
||||
USB Keyboard HID Code range expansion
|
||||
'''
|
||||
return Make.hidCode_range( 'USBCode', rangeVals )
|
||||
|
||||
def sysCode_range( rangeVals ):
|
||||
'''
|
||||
System Control HID Code range expansion
|
||||
'''
|
||||
return Make.hidCode_range( 'SysCode', rangeVals )
|
||||
|
||||
def consCode_range( rangeVals ):
|
||||
'''
|
||||
Consumer Control HID Code range expansion
|
||||
'''
|
||||
return Make.hidCode_range( 'ConsCode', rangeVals )
|
||||
|
||||
def indCode_range( rangeVals ):
|
||||
'''
|
||||
Indicator HID Code range expansion
|
||||
'''
|
||||
return Make.hidCode_range( 'IndCode', rangeVals )
|
||||
|
||||
def range( start, end ):
|
||||
'''
|
||||
Converts a start and end points of a range to a list of numbers
|
||||
|
||||
Can go low to high or high to low
|
||||
'''
|
||||
# High to low
|
||||
if end < start:
|
||||
return list( range( end, start + 1 ) )
|
||||
|
||||
# Low to high
|
||||
return list( range( start, end + 1 ) )
|
||||
|
||||
def capArg( argument, width=None ):
|
||||
'''
|
||||
Converts a capability argument:width to a CapArgId
|
||||
|
||||
If no width is specified, it is ignored
|
||||
'''
|
||||
return CapArgId( argument, width )
|
||||
|
||||
def capUsage( name, arguments ):
|
||||
'''
|
||||
Converts a capability tuple, argument list to a CapId Usage
|
||||
'''
|
||||
return CapId( name, 'Usage', arguments )
|
||||
|
||||
|
||||
|
||||
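The Make functions take no self, so they can be exercised directly as plain callables. A few of the conversions above, as a sketch; this assumes the script is run from the compiler's source tree so that the common and funcparserlib packages resolve.

from common.parse import Make

print( Make.scanCode( 'S0x10' ) )        # ScanCodeId for scan code 16 (0x10)
print( Make.usbCode( 'U"Enter"' ) )      # HIDId( 'USBCode', 0x28 )
print( Make.timing( '1ms' ) )            # 1.0ms  (Time object)
print( Make.position( 'x:20' ) )         # ['x', '20']
print( Make.string( '"this string"' ) )  # this string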
### Rules ###
|
||||
|
||||
## Base Rules
|
||||
|
||||
const = lambda x: lambda _: x
|
||||
unarg = lambda f: lambda x: f(*x)
|
||||
flatten = lambda list: sum( list, [] )
|
||||
|
||||
tokenValue = lambda x: x.value
|
||||
tokenType = lambda t: some( lambda x: x.type == t ) >> tokenValue
|
||||
operator = lambda s: a( Token( 'Operator', s ) ) >> tokenValue
|
||||
parenthesis = lambda s: a( Token( 'Parenthesis', s ) ) >> tokenValue
|
||||
bracket = lambda s: a( Token( 'Bracket', s ) ) >> tokenValue
|
||||
eol = a( Token( 'EndOfLine', ';' ) )
|
||||
|
||||
def maybeFlatten( items ):
|
||||
'''
|
||||
Iterate through top-level lists
|
||||
Flatten, only if the element is also a list
|
||||
|
||||
[[1,2],3,[[4,5]]] -> [1,2,3,[4,5]]
|
||||
'''
|
||||
new_list = []
|
||||
for elem in items:
|
||||
# Flatten only if a list
|
||||
if isinstance( elem, list ):
|
||||
new_list.extend( elem )
|
||||
else:
|
||||
new_list.append( elem )
|
||||
return new_list
|
||||
|
||||
def listElem( item ):
|
||||
'''
|
||||
Convert to a list element
|
||||
'''
|
||||
return [ item ]
|
||||
|
||||
def listToTuple( items ):
|
||||
'''
|
||||
Convert list to a tuple
|
||||
'''
|
||||
return tuple( items )
|
||||
|
||||
def oneLayerFlatten( items ):
|
||||
'''
|
||||
Flatten only the top layer (list of lists of ...)
|
||||
'''
|
||||
mainList = []
|
||||
for sublist in items:
|
||||
for item in sublist:
|
||||
mainList.append( item )
|
||||
|
||||
return mainList
|
||||
|
||||
def optionExpansion( sequences ):
|
||||
'''
|
||||
Expand ranges of values in the 3rd dimension of the list into a list of 2nd-dimension lists
|
||||
|
||||
i.e. [ sequence, [ combo, [ range ] ] ] --> [ [ sequence, [ combo ] ], <option 2>, <option 3> ]
|
||||
'''
|
||||
expandedSequences = []
|
||||
|
||||
# Total number of combinations of the sequence of combos that needs to be generated
|
||||
totalCombinations = 1
|
||||
|
||||
# List of leaf lists, with number of leaves
|
||||
maxLeafList = []
|
||||
|
||||
# Traverse to the leaf nodes, and count the items in each leaf list
|
||||
for sequence in sequences:
|
||||
for combo in sequence:
|
||||
rangeLen = len( combo )
|
||||
totalCombinations *= rangeLen
|
||||
maxLeafList.append( rangeLen )
|
||||
|
||||
# Counter list to keep track of which combination is being generated
|
||||
curLeafList = [0] * len( maxLeafList )
|
||||
|
||||
# Generate a list of permutations of the sequence of combos
|
||||
for count in range( 0, totalCombinations ):
|
||||
expandedSequences.append( [] ) # Prepare list for adding the new combination
|
||||
pos = 0
|
||||
|
||||
# Traverse sequence of combos to generate permutation
|
||||
for sequence in sequences:
|
||||
expandedSequences[ -1 ].append( [] )
|
||||
for combo in sequence:
|
||||
expandedSequences[ -1 ][ -1 ].append( combo[ curLeafList[ pos ] ] )
|
||||
pos += 1
|
||||
|
||||
# Increment combination tracker
|
||||
for leaf in range( 0, len( curLeafList ) ):
|
||||
curLeafList[ leaf ] += 1
|
||||
|
||||
# Reset this position, increment next position (if it exists), then stop
|
||||
if curLeafList[ leaf ] >= maxLeafList[ leaf ]:
|
||||
curLeafList[ leaf ] = 0
|
||||
if leaf + 1 < len( curLeafList ):
|
||||
curLeafList[ leaf + 1 ] += 1
|
||||
|
||||
return expandedSequences
|
||||
|
||||
def listit( t ):
|
||||
'''
|
||||
Convert tuple of tuples to list of lists
|
||||
'''
|
||||
return list( map( listit, t ) ) if isinstance( t, ( list, tuple ) ) else t
|
||||
|
||||
def tupleit( t ):
|
||||
'''
|
||||
Convert list of lists to tuple of tuples
|
||||
'''
|
||||
return tuple( map( tupleit, t ) ) if isinstance( t, ( tuple, list ) ) else t
|
||||
|
||||
|
||||
## Sub Rules
|
||||
|
||||
usbCode = tokenType('USBCode') >> Make.usbCode
|
||||
scanCode = tokenType('ScanCode') >> Make.scanCode
|
||||
consCode = tokenType('ConsCode') >> Make.consCode
|
||||
sysCode = tokenType('SysCode') >> Make.sysCode
|
||||
indCode = tokenType('IndCode') >> Make.indCode
|
||||
animation = tokenType('Animation') >> Make.animation
|
||||
pixel = tokenType('Pixel') >> Make.pixel
|
||||
pixelLayer = tokenType('PixelLayer') >> Make.pixelLayer
|
||||
none = tokenType('None') >> Make.none
|
||||
position = tokenType('Position') >> Make.position
|
||||
name = tokenType('Name')
|
||||
number = tokenType('Number') >> Make.number
|
||||
timing = tokenType('Timing') >> Make.timing
|
||||
comma = tokenType('Comma')
|
||||
dash = tokenType('Dash')
|
||||
plus = tokenType('Plus')
|
||||
content = tokenType('VariableContents')
|
||||
string = tokenType('String') >> Make.string
|
||||
unString = tokenType('String') # When the double quotes are still needed for internal processing
|
||||
seqString = tokenType('SequenceString') >> Make.seqString
|
||||
unseqString = tokenType('SequenceString') >> Make.unseqString # For use with variables
|
||||
pixelOperator = tokenType('PixelOperator')
|
||||
|
||||
# Code variants
|
||||
code_begin = tokenType('CodeBegin')
|
||||
code_end = tokenType('CodeEnd')
|
||||
|
||||
# Specifier
|
||||
specifier_basic = ( timing >> Make.specifierTiming ) | ( name >> Make.specifierState )
|
||||
specifier_complex = ( name + skip( operator(':') ) + timing ) >> unarg( Make.specifierState )
|
||||
specifier_state = specifier_complex | specifier_basic
|
||||
specifier_analog = number >> Make.specifierAnalog
|
||||
specifier_list = skip( parenthesis('(') ) + many( ( specifier_state | specifier_analog ) + skip( maybe( comma ) ) ) + skip( parenthesis(')') )
|
||||
|
||||
# Scan Codes
|
||||
scanCode_start = tokenType('ScanCodeStart')
|
||||
scanCode_range = number + skip( dash ) + number >> Make.scanCode_range
|
||||
scanCode_listElem = number >> Make.scanCode
|
||||
scanCode_specifier = ( scanCode_range | scanCode_listElem ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
scanCode_innerList = many( scanCode_specifier + skip( maybe( comma ) ) ) >> flatten
|
||||
scanCode_expanded = skip( scanCode_start ) + scanCode_innerList + skip( code_end ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
scanCode_elem = scanCode + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
scanCode_combo = oneplus( ( scanCode_expanded | scanCode_elem ) + skip( maybe( plus ) ) )
|
||||
scanCode_sequence = oneplus( scanCode_combo + skip( maybe( comma ) ) )
|
||||
scanCode_single = ( skip( scanCode_start ) + scanCode_listElem + skip( code_end ) ) | scanCode
|
||||
|
||||
# Cons Codes
|
||||
consCode_start = tokenType('ConsCodeStart')
|
||||
consCode_number = number >> Make.consCode_number
|
||||
consCode_range = ( consCode_number | unString ) + skip( dash ) + ( number | unString ) >> Make.consCode_range
|
||||
consCode_listElemTag = unString >> Make.consCode
|
||||
consCode_listElem = ( consCode_number | consCode_listElemTag )
|
||||
consCode_specifier = ( consCode_range | consCode_listElem ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
consCode_innerList = oneplus( consCode_specifier + skip( maybe( comma ) ) ) >> flatten
|
||||
consCode_expanded = skip( consCode_start ) + consCode_innerList + skip( code_end ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
consCode_elem = consCode + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
|
||||
# Sys Codes
|
||||
sysCode_start = tokenType('SysCodeStart')
|
||||
sysCode_number = number >> Make.sysCode_number
|
||||
sysCode_range = ( sysCode_number | unString ) + skip( dash ) + ( number | unString ) >> Make.sysCode_range
|
||||
sysCode_listElemTag = unString >> Make.sysCode
|
||||
sysCode_listElem = ( sysCode_number | sysCode_listElemTag )
|
||||
sysCode_specifier = ( sysCode_range | sysCode_listElem ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
sysCode_innerList = oneplus( sysCode_specifier + skip( maybe( comma ) ) ) >> flatten
|
||||
sysCode_expanded = skip( sysCode_start ) + sysCode_innerList + skip( code_end ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
sysCode_elem = sysCode + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
|
||||
# Indicator Codes
|
||||
indCode_start = tokenType('IndicatorStart')
|
||||
indCode_number = number >> Make.indCode_number
|
||||
indCode_range = ( indCode_number | unString ) + skip( dash ) + ( number | unString ) >> Make.indCode_range
|
||||
indCode_listElemTag = unString >> Make.indCode
|
||||
indCode_listElem = ( indCode_number | indCode_listElemTag )
|
||||
indCode_specifier = ( indCode_range | indCode_listElem ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
indCode_innerList = oneplus( indCode_specifier + skip( maybe( comma ) ) ) >> flatten
|
||||
indCode_expanded = skip( indCode_start ) + indCode_innerList + skip( code_end ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
indCode_elem = indCode + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
|
||||
# USB Codes
|
||||
usbCode_start = tokenType('USBCodeStart')
|
||||
usbCode_number = number >> Make.usbCode_number
|
||||
usbCode_range = ( usbCode_number | unString ) + skip( dash ) + ( number | unString ) >> Make.usbCode_range
|
||||
usbCode_listElemTag = unString >> Make.usbCode
|
||||
usbCode_listElem = ( usbCode_number | usbCode_listElemTag )
|
||||
usbCode_specifier = ( usbCode_range | usbCode_listElem ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
usbCode_innerList = oneplus( usbCode_specifier + skip( maybe( comma ) ) ) >> flatten
|
||||
usbCode_expanded = skip( usbCode_start ) + usbCode_innerList + skip( code_end ) + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
usbCode_elem = usbCode + maybe( specifier_list ) >> unarg( Make.specifierUnroll )
|
||||
|
||||
# HID Codes
|
||||
hidCode_elem = usbCode_expanded | usbCode_elem | sysCode_expanded | sysCode_elem | consCode_expanded | consCode_elem | indCode_expanded | indCode_elem
|
||||
|
||||
usbCode_combo = oneplus( hidCode_elem + skip( maybe( plus ) ) ) >> listElem
|
||||
usbCode_sequence = oneplus( ( usbCode_combo | seqString ) + skip( maybe( comma ) ) ) >> oneLayerFlatten
|
||||
|
||||
# Pixels
|
||||
pixel_start = tokenType('PixelStart')
|
||||
pixel_range = ( number ) + skip( dash ) + ( number ) >> unarg( Make.range )
|
||||
pixel_listElem = number >> listElem
|
||||
pixel_innerList = many( ( pixel_range | pixel_listElem ) + skip( maybe( comma ) ) ) >> flatten >> Make.pixel_list
|
||||
pixel_expanded = skip( pixel_start ) + pixel_innerList + skip( code_end )
|
||||
pixel_elem = pixel >> listElem
|
||||
|
||||
# Pixel Layer
|
||||
pixellayer_start = tokenType('PixelLayerStart')
|
||||
pixellayer_range = ( number ) + skip( dash ) + ( number ) >> unarg( Make.range )
|
||||
pixellayer_listElem = number >> listElem
|
||||
pixellayer_innerList = many( ( pixellayer_range | pixellayer_listElem ) + skip( maybe( comma ) ) ) >> flatten >> Make.pixelLayer_list
|
||||
pixellayer_expanded = skip( pixellayer_start ) + pixellayer_innerList + skip( code_end )
|
||||
pixellayer_elem = pixelLayer >> listElem
|
||||
|
||||
# Pixel Channels
|
||||
pixelchan_chans = many( number + skip( operator(':') ) + number + skip( maybe( comma ) ) )
|
||||
pixelchan_elem = ( pixel_expanded | pixel_elem ) + skip( parenthesis('(') ) + pixelchan_chans + skip( parenthesis(')') ) >> unarg( Make.pixelchan )
|
||||
|
||||
# Pixel Mods
|
||||
pixelmod_mods = many( maybe( pixelOperator | plus | dash ) + number + skip( maybe( comma ) ) )
|
||||
pixelmod_layer = ( pixellayer_expanded | pixellayer_elem )
|
||||
pixelmod_elem = ( pixel_expanded | pixel_elem | pixelmod_layer ) + skip( parenthesis('(') ) + pixelmod_mods + skip( parenthesis(')') ) >> unarg( Make.pixelmod )
|
||||
|
||||
# Pixel Capability
|
||||
pixel_capability = pixelmod_elem
|
||||
|
||||
# Animations
|
||||
animation_start = tokenType('AnimationStart')
|
||||
animation_name = name
|
||||
animation_frame_range = ( number ) + skip( dash ) + ( number ) >> unarg( Make.range )
|
||||
animation_name_frame = many( ( animation_frame_range | number ) + skip( maybe( comma ) ) ) >> maybeFlatten
|
||||
animation_def = skip( animation_start ) + animation_name + skip( code_end ) >> Make.animation
|
||||
animation_expanded = skip( animation_start ) + animation_name + skip( maybe( comma ) ) + animation_name_frame + skip( code_end ) >> unarg( Make.animationTrigger )
|
||||
animation_flattened = animation_expanded >> flatten >> flatten
|
||||
animation_elem = animation
|
||||
|
||||
# Animation Modifier
|
||||
animation_modifier = many( ( name | number ) + maybe( skip( operator(':') ) + number ) + skip( maybe( comma ) ) )
|
||||
animation_modlist = animation_modifier >> Make.animationModlist
|
||||
|
||||
# Animation Capability
|
||||
animation_capability = ( ( animation_def | animation_elem ) + maybe( skip( parenthesis('(') ) + animation_modifier + skip( parenthesis(')') ) ) ) >> unarg( Make.animationCapability )
|
||||
|
||||
# Capabilities
|
||||
capFunc_argument = number >> Make.capArg # TODO Allow for symbolic arguments, i.e. arrays and variables
|
||||
capFunc_arguments = many( capFunc_argument + skip( maybe( comma ) ) )
|
||||
capFunc_elem = name + skip( parenthesis('(') ) + capFunc_arguments + skip( parenthesis(')') ) >> unarg( Make.capUsage ) >> listElem
|
||||
capFunc_combo = oneplus( ( hidCode_elem | capFunc_elem | animation_capability | pixel_capability ) + skip( maybe( plus ) ) ) >> listElem
|
||||
capFunc_sequence = oneplus( ( capFunc_combo | seqString ) + skip( maybe( comma ) ) ) >> oneLayerFlatten
|
||||
|
||||
# Trigger / Result Codes
|
||||
triggerCode_outerList = scanCode_sequence >> optionExpansion
|
||||
triggerUSBCode_outerList = usbCode_sequence >> optionExpansion
|
||||
resultCode_outerList = ( ( capFunc_sequence >> optionExpansion ) | none )
|
||||
|
||||
# Positions
|
||||
position_list = oneplus( position + skip( maybe( comma ) ) )
|
||||
|
common/position.py (new file)
@@ -0,0 +1,114 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Position Containers
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Position:
|
||||
'''
|
||||
Identifier position
|
||||
Each position can have up to 6 different types of measurements
|
||||
|
||||
Distance:
|
||||
x
|
||||
y
|
||||
z
|
||||
|
||||
Angular:
|
||||
rx
|
||||
ry
|
||||
rz
|
||||
'''
|
||||
_parameters = [ 'x', 'y', 'z', 'rx', 'ry', 'rz' ]
|
||||
x = None
|
||||
y = None
|
||||
z = None
|
||||
rx = None
|
||||
ry = None
|
||||
rz = None
|
||||
|
||||
def __init__( self ):
|
||||
# Set all the _parameters to None
|
||||
for param in self._parameters:
|
||||
setattr( self, param, None )
|
||||
|
||||
def positionSet( self ):
|
||||
'''
|
||||
Returns True if any position has been set
|
||||
'''
|
||||
for param in self._parameters:
|
||||
if getattr( self, param ) is not None:
|
||||
return True
|
||||
return False
|
||||
|
||||
def setPosition( self, positions ):
|
||||
'''
|
||||
Applies given list of position measurements
|
||||
|
||||
None signifies an undefined position which may be assigned at a later point.
|
||||
Otherwise, it will be set to 0 at a later stage
|
||||
|
||||
If a position is already set, do not overwrite, expressions are read inside->out
|
||||
'''
|
||||
for position in positions:
|
||||
name = position[0]
|
||||
value = position[1]
|
||||
|
||||
# Check to make sure parameter is valid
|
||||
if name not in self._parameters:
|
||||
print( "{0} '{1}' is not a valid position parameter.".format( ERROR, name ) )
|
||||
continue
|
||||
|
||||
# Only set if None
|
||||
if getattr( self, name ) is None:
|
||||
setattr( self, name, value )
|
||||
|
||||
def strPosition( self ):
|
||||
'''
|
||||
__repr__ of Position when multiple inheritance is used
|
||||
'''
|
||||
output = ""
|
||||
|
||||
# Check each of the position parameters, only show the ones that are not None
|
||||
count = 0
|
||||
for param in self._parameters:
|
||||
value = getattr( self, param )
|
||||
if value is not None:
|
||||
if count > 0:
|
||||
output += ","
|
||||
output += "{0}:{1}".format( param, value )
|
||||
count += 1
|
||||
|
||||
return output
|
||||
|
||||
def __repr__( self ):
|
||||
return self.strPosition()
|
||||
|
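A quick sketch of the Position container, fed with the (name, value) pairs that Make.position() produces; note the values stay as the parsed strings, the class does not convert them.

from common.parse import Make
from common.position import Position

pos = Position()
pos.setPosition( [ Make.position( 'x:20' ), Make.position( 'y:-15.4' ) ] )

print( pos.positionSet() )   # True
print( pos )                 # x:20,y:-15.4

# Already-set parameters are not overwritten (expressions are read inside->out)
pos.setPosition( [ [ 'x', '5' ] ] )
print( pos )                 # x:20,y:-15.4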
common/schedule.py (new file)
@@ -0,0 +1,180 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Schedule Containers
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Time:
|
||||
'''
|
||||
Time parameter
|
||||
'''
|
||||
def __init__( self, time, unit ):
|
||||
self.time = time
|
||||
self.unit = unit
|
||||
|
||||
def __repr__( self ):
|
||||
return "{0}{1}".format( self.time, self.unit )
|
||||
|
||||
|
||||
class Schedule:
|
||||
'''
|
||||
Identifier schedule
|
||||
Each schedule may have multiple parameters configuring how the element is scheduled
|
||||
|
||||
Used for trigger and result elements
|
||||
'''
|
||||
def __init__( self ):
|
||||
self.parameters = None
|
||||
|
||||
def setSchedule( self, parameters ):
|
||||
'''
|
||||
Applies given list of Schedule Parameters to Schedule
|
||||
|
||||
None signifies an undefined schedule which allows free-form scheduling
|
||||
at either a later stage or at the convenience of the device firmware/driver
|
||||
|
||||
If schedule is already set, do not overwrite, expressions are read inside->out
|
||||
'''
|
||||
# Ignore if already set
|
||||
if self.parameters is not None:
|
||||
return
|
||||
|
||||
self.parameters = parameters
|
||||
|
||||
def strSchedule( self ):
|
||||
'''
|
||||
__repr__ of Schedule when multiple inheritance is used
|
||||
'''
|
||||
output = ""
|
||||
if self.parameters is not None:
|
||||
for index, param in enumerate( self.parameters ):
|
||||
if index > 0:
|
||||
output += ","
|
||||
output += "{0}".format( param )
|
||||
return output
|
||||
|
||||
def __repr__( self ):
|
||||
return self.strSchedule()
|
||||
|
||||
|
||||
class ScheduleParam:
|
||||
'''
|
||||
Schedule parameter
|
||||
|
||||
In the case of a Timing parameter, the base type is unknown and must be inferred later
|
||||
'''
|
||||
def __init__( self, state, timing=None ):
|
||||
self.state = state
|
||||
self.timing = timing
|
||||
|
||||
# Mutate class into the desired type
|
||||
if self.state in ['P', 'H', 'R', 'O', 'UP', 'UR']:
|
||||
self.__class__ = ButtonScheduleParam
|
||||
elif self.state in ['A', 'On', 'D', 'Off']:
|
||||
self.__class__ = IndicatorScheduleParam
|
||||
elif self.state is None and self.timing is not None:
|
||||
pass
|
||||
else:
|
||||
print( "{0} Invalid ScheduleParam state '{1}'".format( ERROR, self.state ) )
|
||||
|
||||
def setTiming( self, timing ):
|
||||
'''
|
||||
Set parameter timing
|
||||
'''
|
||||
self.timing = timing
|
||||
|
||||
def __repr__( self ):
|
||||
output = ""
|
||||
if self.state is None and self.timing is not None:
|
||||
output += "{0}".format( self.timing )
|
||||
else:
|
||||
output += "??"
|
||||
print( "{0} Unknown ScheduleParam state '{1}'".format( ERROR, self.state ) )
|
||||
return output
|
||||
|
||||
|
||||
class ButtonScheduleParam( ScheduleParam ):
|
||||
'''
|
||||
Button Schedule Parameter
|
||||
|
||||
Accepts:
|
||||
P - Press
|
||||
H - Hold
|
||||
R - Release
|
||||
O - Off
|
||||
UP - Unique Press
|
||||
UR - Unique Release
|
||||
|
||||
Timing specifiers are valid.
|
||||
Validity of specifiers is context dependent; they may error at a later stage, or be stripped altogether
|
||||
'''
|
||||
def __repr__( self ):
|
||||
output = "{0}".format( self.state )
|
||||
if self.timing is not None:
|
||||
output += ":{0}".format( self.timing )
|
||||
return output
|
||||
|
||||
|
||||
class AnalogScheduleParam( ScheduleParam ):
|
||||
'''
|
||||
Analog Schedule Parameter
|
||||
|
||||
Accepts:
|
||||
Value from 0 to 100, indicating a percentage pressed
|
||||
|
||||
XXX: Might be useful to accept decimal percentages
|
||||
'''
|
||||
def __init__( self, state ):
|
||||
self.state = state
|
||||
|
||||
def __repr__( self ):
|
||||
return "Analog({0})".format( self.state )
|
||||
|
||||
|
||||
class IndicatorScheduleParam( ScheduleParam ):
|
||||
'''
|
||||
Indicator Schedule Parameter
|
||||
|
||||
Accepts:
|
||||
A - Activate
|
||||
On
|
||||
D - Deactivate
|
||||
Off
|
||||
|
||||
Timing specifiers are valid.
|
||||
Validity of specifiers is context dependent; they may error at a later stage, or be stripped altogether
|
||||
'''
|
||||
def __repr__( self ):
|
||||
output = "{0}".format( self.state )
|
||||
if self.timing is not None:
|
||||
output += ":{0}".format( self.timing )
|
||||
return output
|
||||
|
common/stage.py: new file, 2064 lines (diff suppressed because it is too large)
emitters/__init__.py: new file, 0 lines
emitters/emitters.py: new file, 102 lines
@ -0,0 +1,102 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Emitters Container Classes
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
import emitters.kiibohd.kiibohd as kiibohd
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Emitters:
|
||||
'''
|
||||
Container class for KLL emitters
|
||||
|
||||
NOTES: To add a new emitter
|
||||
- Add a new directory for your emitter (e.g. kiibohd)
|
||||
- Add at least two files in this directory (<name>.py and __init__.py)
|
||||
- In <name>.py have one class that inherits the Emitter class from common.emitter
|
||||
- Add to list of emitters below
|
||||
- Add import statement to the top of this file
|
||||
- The control object allows access to the entire set of KLL datastructures
|
||||
'''
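# Illustrative sketch only (not part of this commit): a minimal emitter module
# following the NOTES above. The name 'myemitter' is hypothetical; the
# process()/output() hooks are assumed from the Kiibohd emitter further below, and,
# depending on the Emitter base class, command_line_args()/command_line_flags()
# may also need to be provided (the Kiibohd emitter defines both).
#
#   # emitters/myemitter/__init__.py   (empty)
#   # emitters/myemitter/myemitter.py
#   from common.emitter import Emitter
#
#   class MyEmitter( Emitter ):
#       def __init__( self, control ):
#           Emitter.__init__( self, control )
#
#       def process( self ):
#           pass  # Query control.stage(...) datastructures and build the output data
#
#       def output( self ):
#           pass  # Write the generated files
#
# The new module would then be imported at the top of this file and registered
# in the self.emitters dictionary below, e.g. 'myemitter' : myemitter.MyEmitter( control )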
|
||||
|
||||
def __init__( self, control ):
|
||||
'''
|
||||
Emitter initialization
|
||||
|
||||
@param control: ControlStage object, used to access data from other stages
|
||||
'''
|
||||
# Default emitter
|
||||
self.default = "kiibohd"
|
||||
|
||||
# Dictionary of Emitters
|
||||
self.emitters = {
|
||||
'kiibohd' : kiibohd.Kiibohd( control )
|
||||
}
|
||||
|
||||
def emitter_default( self ):
|
||||
'''
|
||||
Returns string name of default emitter
|
||||
'''
|
||||
return self.default
|
||||
|
||||
def emitter_list( self ):
|
||||
'''
|
||||
List of emitters available
|
||||
'''
|
||||
return list( self.emitters.keys() )
|
||||
|
||||
def emitter( self, emitter ):
|
||||
'''
|
||||
Returns an emitter object
|
||||
'''
|
||||
return self.emitters[ emitter ]
|
||||
|
||||
def command_line_args( self, args ):
|
||||
'''
|
||||
Group parser fan-out for emitter command line arguments
|
||||
|
||||
@param args: Name space of processed arguments
|
||||
'''
|
||||
# Always process command line args in the same order
|
||||
for key, emitter in sorted( self.emitters.items(), key=lambda x: x[0] ):
|
||||
emitter.command_line_args( args )
|
||||
|
||||
def command_line_flags( self, parser ):
|
||||
'''
|
||||
Group parser fan-out for emitter command line options
|
||||
|
||||
@param parser: argparse setup object
|
||||
'''
|
||||
# Always process command line flags in the same order
|
||||
for key, emitter in sorted( self.emitters.items(), key=lambda x: x[0] ):
|
||||
emitter.command_line_flags( parser )
|
||||
|
||||
|
emitters/kiibohd/__init__.py: new file, 0 lines
emitters/kiibohd/kiibohd.py: new file, 466 lines
@ -0,0 +1,466 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Kiibohd .h File Emitter
|
||||
'''
|
||||
|
||||
# Copyright (C) 2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
import re
|
||||
import sys
|
||||
|
||||
from datetime import date
|
||||
|
||||
from common.emitter import Emitter, TextEmitter
|
||||
from common.hid_dict import kll_hid_lookup_dictionary
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
|
||||
### Classes ###
|
||||
|
||||
class Kiibohd( Emitter, TextEmitter ):
|
||||
'''
|
||||
Kiibohd .h file emitter for KLL
|
||||
'''
|
||||
|
||||
# List of required capabilities
|
||||
requiredCapabilities = {
|
||||
'CONS' : 'consCtrlOut',
|
||||
'NONE' : 'noneOut',
|
||||
'SYS' : 'sysCtrlOut',
|
||||
'USB' : 'usbKeyOut',
|
||||
}
|
||||
|
||||
def __init__( self, control ):
|
||||
'''
|
||||
Emitter initialization
|
||||
|
||||
@param control: ControlStage object, used to access data from other stages
|
||||
'''
|
||||
Emitter.__init__( self, control )
|
||||
TextEmitter.__init__( self )
|
||||
|
||||
# Defaults
|
||||
self.map_template = "templates/kiibohdKeymap.h"
|
||||
self.def_template = "templates/kiibohdDefs.h"
|
||||
self.map_output = "generatedKeymap.h"
|
||||
self.def_output = "kll_defs.h"
|
||||
|
||||
self.fill_dict = {}
|
||||
|
||||
def command_line_args( self, args ):
|
||||
'''
|
||||
Group parser for command line arguments
|
||||
|
||||
@param args: Name space of processed arguments
|
||||
'''
|
||||
self.def_template = args.def_template
|
||||
self.map_template = args.map_template
|
||||
self.def_output = args.def_output
|
||||
self.map_output = args.map_output
|
||||
|
||||
def command_line_flags( self, parser ):
|
||||
'''
|
||||
Group parser for command line options
|
||||
|
||||
@param parser: argparse setup object
|
||||
'''
|
||||
# Create new option group
|
||||
group = parser.add_argument_group('\033[1mKiibohd Emitter Configuration\033[0m')
|
||||
|
||||
group.add_argument( '--def-template', type=str, default=self.def_template,
|
||||
help="Specify KLL define .h file template.\n"
|
||||
"\033[1mDefault\033[0m: {0}\n".format( self.def_template )
|
||||
)
|
||||
group.add_argument( '--map-template', type=str, default=self.map_template,
|
||||
help="Specify KLL map .h file template.\n"
|
||||
"\033[1mDefault\033[0m: {0}\n".format( self.map_template )
|
||||
)
|
||||
group.add_argument( '--def-output', type=str, default=self.def_output,
|
||||
help="Specify KLL define .h file output.\n"
|
||||
"\033[1mDefault\033[0m: {0}\n".format( self.def_output )
|
||||
)
|
||||
group.add_argument( '--map-output', type=str, default=self.map_output,
|
||||
help="Specify KLL map .h file output.\n"
|
||||
"\033[1mDefault\033[0m: {0}\n".format( self.map_output )
|
||||
)
|
||||
|
||||
def output( self ):
|
||||
'''
|
||||
Final Stage of Emitter
|
||||
|
||||
Generate desired outputs from templates
|
||||
'''
|
||||
# Load define template and generate
|
||||
self.load_template( self.def_template )
|
||||
self.generate( self.def_output )
|
||||
|
||||
# Load keymap template and generate
|
||||
self.load_template( self.map_template )
|
||||
self.generate( self.map_output )
|
||||
|
||||
def process( self ):
|
||||
'''
|
||||
Emitter Processing
|
||||
|
||||
Takes KLL datastructures and Analysis results then populates the fill_dict
|
||||
The fill_dict is used to populate the template files.
|
||||
'''
|
||||
# Acquire Datastructures
|
||||
early_contexts = self.control.stage('DataOrganizationStage').contexts
|
||||
base_context = self.control.stage('DataFinalizationStage').base_context
|
||||
default_context = self.control.stage('DataFinalizationStage').default_context
|
||||
partial_contexts = self.control.stage('DataFinalizationStage').partial_contexts
|
||||
full_context = self.control.stage('DataFinalizationStage').full_context
|
||||
|
||||
|
||||
# Build string list of compiler arguments
|
||||
compilerArgs = ""
|
||||
for arg in sys.argv:
|
||||
if "--" in arg or ".py" in arg:
|
||||
compilerArgs += "// {0}\n".format( arg )
|
||||
else:
|
||||
compilerArgs += "// {0}\n".format( arg )
|
||||
|
||||
|
||||
# Build a string of modified files, if any
|
||||
gitChangesStr = "\n"
|
||||
if len( self.control.git_changes ) > 0:
|
||||
for gitFile in self.control.git_changes:
|
||||
gitChangesStr += "// {0}\n".format( gitFile )
|
||||
else:
|
||||
gitChangesStr = " None\n"
|
||||
|
||||
|
||||
# Prepare BaseLayout and Layer Info
|
||||
configLayoutInfo = ""
|
||||
if 'ConfigurationContext' in early_contexts.keys():
|
||||
contexts = early_contexts['ConfigurationContext'].query_contexts( 'AssignmentExpression', 'Array' )
|
||||
for sub in contexts:
|
||||
name = sub[0].data['Name'].value
|
||||
configLayoutInfo += "// {0}\n// {1}\n".format( name, sub[1].parent.path )
|
||||
|
||||
genericLayoutInfo = ""
|
||||
if 'GenericContext' in early_contexts.keys():
|
||||
contexts = early_contexts['GenericContext'].query_contexts( 'AssignmentExpression', 'Array' )
|
||||
for sub in contexts:
|
||||
name = sub[0].data['Name'].value
|
||||
genericLayoutInfo += "// {0}\n// {1}\n".format( name, sub[1].parent.path )
|
||||
|
||||
baseLayoutInfo = ""
|
||||
if 'BaseMapContext' in early_contexts.keys():
|
||||
contexts = early_contexts['BaseMapContext'].query_contexts( 'AssignmentExpression', 'Array' )
|
||||
for sub in contexts:
|
||||
name = sub[0].data['Name'].value
|
||||
baseLayoutInfo += "// {0}\n// {1}\n".format( name, sub[1].parent.path )
|
||||
|
||||
defaultLayerInfo = ""
|
||||
if 'DefaultMapContext' in early_contexts.keys():
|
||||
contexts = early_contexts['DefaultMapContext'].query_contexts( 'AssignmentExpression', 'Array' )
|
||||
for sub in contexts:
|
||||
name = sub[0].data['Name'].value
|
||||
defaultLayerInfo += "// {0}\n// {1}\n".format( name, sub[1].parent.path )
|
||||
|
||||
partialLayersInfo = ""
|
||||
partial_context_list = [
|
||||
( item[1].layer, item[0] )
|
||||
for item in early_contexts.items()
|
||||
if 'PartialMapContext' in item[0]
|
||||
]
|
||||
for layer, tag in sorted( partial_context_list, key=lambda x: x[0] ):
|
||||
partialLayersInfo += "// Layer {0}\n".format( layer + 1 )
|
||||
contexts = early_contexts[ tag ].query_contexts( 'AssignmentExpression', 'Array' )
|
||||
for sub in contexts:
|
||||
name = sub[0].data['Name'].value
|
||||
partialLayersInfo += "// {0}\n// {1}\n".format( name, sub[1].parent.path )
|
||||
|
||||
|
||||
## Information ##
|
||||
self.fill_dict['Information'] = "// This file was generated by the kll compiler, DO NOT EDIT.\n"
|
||||
self.fill_dict['Information'] += "// Generation Date: {0}\n".format( date.today() )
|
||||
self.fill_dict['Information'] += "// KLL Emitter: {0}\n".format(
|
||||
self.control.stage('CompilerConfigurationStage').emitter
|
||||
)
|
||||
self.fill_dict['Information'] += "// KLL Version: {0}\n".format( self.control.version )
|
||||
self.fill_dict['Information'] += "// KLL Git Changes:{0}".format( gitChangesStr )
|
||||
self.fill_dict['Information'] += "// Compiler arguments:\n{0}".format( compilerArgs )
|
||||
self.fill_dict['Information'] += "//\n"
|
||||
self.fill_dict['Information'] += "// - Configuration File -\n{0}".format( configLayoutInfo )
|
||||
self.fill_dict['Information'] += "// - Generic Files -\n{0}".format( genericLayoutInfo )
|
||||
self.fill_dict['Information'] += "// - Base Layer -\n{0}".format( baseLayoutInfo )
|
||||
self.fill_dict['Information'] += "// - Default Layer -\n{0}".format( defaultLayerInfo )
|
||||
self.fill_dict['Information'] += "// - Partial Layers -\n{0}".format( partialLayersInfo )
|
||||
|
||||
|
||||
## Defines ##
|
||||
self.fill_dict['Defines'] = ""
|
||||
|
||||
# Iterate through defines and lookup the variables
|
||||
defines = full_context.query( 'NameAssociationExpression', 'Define' )
|
||||
variables = full_context.query( 'AssignmentExpression', 'Variable' )
|
||||
for dkey, dvalue in sorted( defines.data.items() ):
|
||||
if dvalue.name in variables.data.keys():
|
||||
self.fill_dict['Defines'] += "\n#define {0} {1}".format(
|
||||
dvalue.association,
|
||||
variables.data[ dvalue.name ].value.replace( '\n', ' \\\n' ),
|
||||
)
|
||||
else:
|
||||
print( "{0} '{1}' not defined...".format( WARNING, dvalue.name ) )
|
||||
|
||||
|
||||
## Capabilities ##
|
||||
self.fill_dict['CapabilitiesFuncDecl'] = ""
|
||||
self.fill_dict['CapabilitiesList'] = "const Capability CapabilitiesList[] = {\n"
|
||||
self.fill_dict['CapabilitiesIndices'] = "typedef enum CapabilityIndex {\n"
|
||||
|
||||
# Keys are pre-sorted
|
||||
capabilities = full_context.query( 'NameAssociationExpression', 'Capability' )
|
||||
for dkey, dvalue in sorted( capabilities.data.items() ):
|
||||
funcName = dvalue.association.name
|
||||
argByteWidth = dvalue.association.total_arg_bytes()
|
||||
|
||||
self.fill_dict['CapabilitiesList'] += "\t{{ {0}, {1} }},\n".format( funcName, argByteWidth )
|
||||
self.fill_dict['CapabilitiesFuncDecl'] += \
|
||||
"void {0}( uint8_t state, uint8_t stateType, uint8_t *args );\n".format( funcName )
|
||||
self.fill_dict['CapabilitiesIndices'] += "\t{0}_index,\n".format( funcName )
|
||||
|
||||
self.fill_dict['CapabilitiesList'] += "};"
|
||||
self.fill_dict['CapabilitiesIndices'] += "} CapabilityIndex;"
|
||||
return
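# NOTE: everything below this return is currently unreachable; it still refers
# to a 'macros' object from the original backend that is never defined in this
# method, so the remaining sections appear not to be ported to the new compiler yet.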
|
||||
|
||||
|
||||
## Results Macros ##
|
||||
self.fill_dict['ResultMacros'] = ""
|
||||
|
||||
# Iterate through each of the result macros
|
||||
for result in range( 0, len( macros.resultsIndexSorted ) ):
|
||||
self.fill_dict['ResultMacros'] += "Guide_RM( {0} ) = {{ ".format( result )
|
||||
|
||||
# Add the result macro capability index guide (including capability arguments)
|
||||
# See kiibohd controller Macros/PartialMap/kll.h for exact formatting details
|
||||
for sequence in range( 0, len( macros.resultsIndexSorted[ result ] ) ):
|
||||
# If the sequence is longer than 1, prepend a sequence spacer
|
||||
# Needed for USB behaviour; otherwise, repeated keys will not work
|
||||
if sequence > 0:
|
||||
# <single element>, <usbCodeSend capability>, <USB Code 0x00>
|
||||
self.fill_dict['ResultMacros'] += "1, {0}, 0x00, ".format( capabilities.getIndex( self.capabilityLookup('USB') ) )
|
||||
|
||||
# For each combo in the sequence, add the length of the combo
|
||||
self.fill_dict['ResultMacros'] += "{0}, ".format( len( macros.resultsIndexSorted[ result ][ sequence ] ) )
|
||||
|
||||
# For each combo, add each of the capabilities used and their arguments
|
||||
for combo in range( 0, len( macros.resultsIndexSorted[ result ][ sequence ] ) ):
|
||||
resultItem = macros.resultsIndexSorted[ result ][ sequence ][ combo ]
|
||||
|
||||
# Add the capability index
|
||||
self.fill_dict['ResultMacros'] += "{0}, ".format( capabilities.getIndex( resultItem[0] ) )
|
||||
|
||||
# Add each of the arguments of the capability
|
||||
for arg in range( 0, len( resultItem[1] ) ):
|
||||
# Special cases
|
||||
if isinstance( resultItem[1][ arg ], str ):
|
||||
# If this is a CONSUMER_ element, it needs to be split into 2 elements
|
||||
# AC_ and AL_ are other sections of consumer control
|
||||
if re.match( r'^(CONSUMER|AC|AL)_', resultItem[1][ arg ] ):
|
||||
tag = resultItem[1][ arg ].split( '_', 1 )[1]
|
||||
if '_' in tag:
|
||||
tag = tag.replace( '_', '' )
|
||||
try:
|
||||
lookupNum = kll_hid_lookup_dictionary['ConsCode'][ tag ][1]
|
||||
except KeyError as err:
|
||||
print ( "{0} {1} Consumer HID kll bug...please report.".format( ERROR, err ) )
|
||||
raise
|
||||
byteForm = lookupNum.to_bytes( 2, byteorder='little' ) # XXX Yes, little endian from how the uC structs work
|
||||
self.fill_dict['ResultMacros'] += "{0}, {1}, ".format( *byteForm )
|
||||
continue
|
||||
|
||||
# None, fall-through disable
|
||||
elif resultItem[0] is self.capabilityLookup('NONE'):
|
||||
continue
|
||||
|
||||
self.fill_dict['ResultMacros'] += "{0}, ".format( resultItem[1][ arg ] )
|
||||
|
||||
# If sequence is longer than 1, append a sequence spacer at the end of the sequence
|
||||
# Required by USB to end the sequence without holding the key down
|
||||
if len( macros.resultsIndexSorted[ result ] ) > 1:
|
||||
# <single element>, <usbCodeSend capability>, <USB Code 0x00>
|
||||
self.fill_dict['ResultMacros'] += "1, {0}, 0x00, ".format( capabilities.getIndex( self.capabilityLookup('USB') ) )
|
||||
|
||||
# Add list ending 0 and end of list
|
||||
self.fill_dict['ResultMacros'] += "0 };\n"
|
||||
self.fill_dict['ResultMacros'] = self.fill_dict['ResultMacros'][:-1] # Remove last newline
|
||||
|
||||
|
||||
## Result Macro List ##
|
||||
self.fill_dict['ResultMacroList'] = "const ResultMacro ResultMacroList[] = {\n"
|
||||
|
||||
# Iterate through each of the result macros
|
||||
for result in range( 0, len( macros.resultsIndexSorted ) ):
|
||||
self.fill_dict['ResultMacroList'] += "\tDefine_RM( {0} ),\n".format( result )
|
||||
self.fill_dict['ResultMacroList'] += "};"
|
||||
|
||||
|
||||
## Result Macro Record ##
|
||||
self.fill_dict['ResultMacroRecord'] = "ResultMacroRecord ResultMacroRecordList[ ResultMacroNum ];"
|
||||
|
||||
|
||||
## Trigger Macros ##
|
||||
self.fill_dict['TriggerMacros'] = ""
|
||||
|
||||
# Iterate through each of the trigger macros
|
||||
for trigger in range( 0, len( macros.triggersIndexSorted ) ):
|
||||
self.fill_dict['TriggerMacros'] += "Guide_TM( {0} ) = {{ ".format( trigger )
|
||||
|
||||
# Add the trigger macro scan code guide
|
||||
# See kiibohd controller Macros/PartialMap/kll.h for exact formatting details
|
||||
for sequence in range( 0, len( macros.triggersIndexSorted[ trigger ][0] ) ):
|
||||
# For each combo in the sequence, add the length of the combo
|
||||
self.fill_dict['TriggerMacros'] += "{0}, ".format( len( macros.triggersIndexSorted[ trigger ][0][ sequence ] ) )
|
||||
|
||||
# For each combo, add the key type, key state and scan code
|
||||
for combo in range( 0, len( macros.triggersIndexSorted[ trigger ][0][ sequence ] ) ):
|
||||
triggerItemId = macros.triggersIndexSorted[ trigger ][0][ sequence ][ combo ]
|
||||
|
||||
# Lookup triggerItem in ScanCodeStore
|
||||
triggerItemObj = macros.scanCodeStore[ triggerItemId ]
|
||||
triggerItem = triggerItemObj.offset( macros.interconnectOffset )
|
||||
|
||||
# TODO Add support for Analog keys
|
||||
# TODO Add support for LED states
|
||||
self.fill_dict['TriggerMacros'] += "0x00, 0x01, 0x{0:02X}, ".format( triggerItem )
|
||||
|
||||
# Add list ending 0 and end of list
|
||||
self.fill_dict['TriggerMacros'] += "0 };\n"
|
||||
self.fill_dict['TriggerMacros'] = self.fill_dict['TriggerMacros'][ :-1 ] # Remove last newline
|
||||
|
||||
|
||||
## Trigger Macro List ##
|
||||
self.fill_dict['TriggerMacroList'] = "const TriggerMacro TriggerMacroList[] = {\n"
|
||||
|
||||
# Iterate through each of the trigger macros
|
||||
for trigger in range( 0, len( macros.triggersIndexSorted ) ):
|
||||
# Use TriggerMacro Index, and the corresponding ResultMacro Index
|
||||
self.fill_dict['TriggerMacroList'] += "\tDefine_TM( {0}, {1} ),\n".format( trigger, macros.triggersIndexSorted[ trigger ][1] )
|
||||
self.fill_dict['TriggerMacroList'] += "};"
|
||||
|
||||
|
||||
## Trigger Macro Record ##
|
||||
self.fill_dict['TriggerMacroRecord'] = "TriggerMacroRecord TriggerMacroRecordList[ TriggerMacroNum ];"
|
||||
|
||||
|
||||
## Max Scan Code ##
|
||||
self.fill_dict['MaxScanCode'] = "#define MaxScanCode 0x{0:X}".format( macros.overallMaxScanCode )
|
||||
|
||||
|
||||
## Interconnect ScanCode Offset List ##
|
||||
self.fill_dict['ScanCodeInterconnectOffsetList'] = "const uint8_t InterconnectOffsetList[] = {\n"
|
||||
for offset in range( 0, len( macros.interconnectOffset ) ):
|
||||
self.fill_dict['ScanCodeInterconnectOffsetList'] += "\t0x{0:02X},\n".format( macros.interconnectOffset[ offset ] )
|
||||
self.fill_dict['ScanCodeInterconnectOffsetList'] += "};"
|
||||
|
||||
|
||||
## Max Interconnect Nodes ##
|
||||
self.fill_dict['InterconnectNodeMax'] = "#define InterconnectNodeMax 0x{0:X}\n".format( len( macros.interconnectOffset ) )
|
||||
|
||||
|
||||
## Default Layer and Default Layer Scan Map ##
|
||||
self.fill_dict['DefaultLayerTriggerList'] = ""
|
||||
self.fill_dict['DefaultLayerScanMap'] = "const nat_ptr_t *default_scanMap[] = {\n"
|
||||
|
||||
# Iterate over triggerList and generate a C trigger array for the default map and default map array
|
||||
for triggerList in range( macros.firstScanCode[0], len( macros.triggerList[0] ) ):
|
||||
# Generate ScanCode index and triggerList length
|
||||
self.fill_dict['DefaultLayerTriggerList'] += "Define_TL( default, 0x{0:02X} ) = {{ {1}".format( triggerList, len( macros.triggerList[0][ triggerList ] ) )
|
||||
|
||||
# Add scanCode trigger list to Default Layer Scan Map
|
||||
self.fill_dict['DefaultLayerScanMap'] += "default_tl_0x{0:02X}, ".format( triggerList )
|
||||
|
||||
# Add each item of the trigger list
|
||||
for triggerItem in macros.triggerList[0][ triggerList ]:
|
||||
self.fill_dict['DefaultLayerTriggerList'] += ", {0}".format( triggerItem )
|
||||
|
||||
self.fill_dict['DefaultLayerTriggerList'] += " };\n"
|
||||
self.fill_dict['DefaultLayerTriggerList'] = self.fill_dict['DefaultLayerTriggerList'][:-1] # Remove last newline
|
||||
self.fill_dict['DefaultLayerScanMap'] = self.fill_dict['DefaultLayerScanMap'][:-2] # Remove last comma and space
|
||||
self.fill_dict['DefaultLayerScanMap'] += "\n};"
|
||||
|
||||
|
||||
## Partial Layers and Partial Layer Scan Maps ##
|
||||
self.fill_dict['PartialLayerTriggerLists'] = ""
|
||||
self.fill_dict['PartialLayerScanMaps'] = ""
|
||||
|
||||
# Iterate over each of the layers, excluding the default layer
|
||||
for layer in range( 1, len( macros.triggerList ) ):
|
||||
# Prepare each layer
|
||||
self.fill_dict['PartialLayerScanMaps'] += "// Partial Layer {0}\n".format( layer )
|
||||
self.fill_dict['PartialLayerScanMaps'] += "const nat_ptr_t *layer{0}_scanMap[] = {{\n".format( layer )
|
||||
self.fill_dict['PartialLayerTriggerLists'] += "// Partial Layer {0}\n".format( layer )
|
||||
|
||||
# Iterate over triggerList and generate a C trigger array for the layer
|
||||
for triggerList in range( macros.firstScanCode[ layer ], len( macros.triggerList[ layer ] ) ):
|
||||
# Generate ScanCode index and triggerList length
|
||||
self.fill_dict['PartialLayerTriggerLists'] += "Define_TL( layer{0}, 0x{1:02X} ) = {{ {2}".format( layer, triggerList, len( macros.triggerList[ layer ][ triggerList ] ) )
|
||||
|
||||
# Add scanCode trigger list to the Partial Layer Scan Map
|
||||
self.fill_dict['PartialLayerScanMaps'] += "layer{0}_tl_0x{1:02X}, ".format( layer, triggerList )
|
||||
|
||||
# Add each item of the trigger list
|
||||
for trigger in macros.triggerList[ layer ][ triggerList ]:
|
||||
self.fill_dict['PartialLayerTriggerLists'] += ", {0}".format( trigger )
|
||||
|
||||
self.fill_dict['PartialLayerTriggerLists'] += " };\n"
|
||||
self.fill_dict['PartialLayerTriggerLists'] += "\n"
|
||||
self.fill_dict['PartialLayerScanMaps'] = self.fill_dict['PartialLayerScanMaps'][:-2] # Remove last comma and space
|
||||
self.fill_dict['PartialLayerScanMaps'] += "\n};\n\n"
|
||||
self.fill_dict['PartialLayerTriggerLists'] = self.fill_dict['PartialLayerTriggerLists'][:-2] # Remove last 2 newlines
|
||||
self.fill_dict['PartialLayerScanMaps'] = self.fill_dict['PartialLayerScanMaps'][:-2] # Remove last 2 newlines
|
||||
|
||||
|
||||
## Layer Index List ##
|
||||
self.fill_dict['LayerIndexList'] = "const Layer LayerIndex[] = {\n"
|
||||
|
||||
# Iterate over each layer, adding it to the list
|
||||
for layer in range( 0, len( macros.triggerList ) ):
|
||||
# Lookup first scancode in map
|
||||
firstScanCode = macros.firstScanCode[ layer ]
|
||||
|
||||
# Generate stacked name
|
||||
stackName = ""
|
||||
if '*NameStack' in variables.layerVariables[ layer ].keys():
|
||||
for name in range( 0, len( variables.layerVariables[ layer ]['*NameStack'] ) ):
|
||||
stackName += "{0} + ".format( variables.layerVariables[ layer ]['*NameStack'][ name ] )
|
||||
stackName = stackName[:-3]
|
||||
|
||||
# Default map is a special case, always the first index
|
||||
if layer == 0:
|
||||
self.fill_dict['LayerIndexList'] += '\tLayer_IN( default_scanMap, "D: {1}", 0x{0:02X} ),\n'.format( firstScanCode, stackName )
|
||||
else:
|
||||
self.fill_dict['LayerIndexList'] += '\tLayer_IN( layer{0}_scanMap, "{0}: {2}", 0x{1:02X} ),\n'.format( layer, firstScanCode, stackName )
|
||||
self.fill_dict['LayerIndexList'] += "};"
|
||||
|
||||
|
||||
## Layer State ##
|
||||
self.fill_dict['LayerState'] = "uint8_t LayerState[ LayerNum ];"
|
||||
|
examples/assignment.kll: new file, 5 lines
@ -0,0 +1,5 @@
|
||||
Variable = 1;
|
||||
Array[] = a b c "b c" 3;
|
||||
Index[5] = "this text" thing; # Single element
|
||||
Index[6] = moar;
|
||||
|
@ -8,7 +8,7 @@ Date = 2014-06-12;
|
||||
|
||||
S0x40 : U"Backspace";
|
||||
|
||||
S0x42 : U"}";
|
||||
S0x42 : U"]";
|
||||
S0x43 : U"Delete";
|
||||
S0x44 : U"Enter";
|
||||
|
||||
@ -16,7 +16,7 @@ S0x46 : U"RShift";
|
||||
S0x47 : U"RCtrl";
|
||||
S0x48 : U"=";
|
||||
S0x49 : U"-";
|
||||
S0x4A : U"{";
|
||||
S0x4A : U"[";
|
||||
S0x4B : U"\";
|
||||
S0x4C : U"'";
|
||||
S0x4D : U"/";
|
||||
|
@ -10,25 +10,25 @@ test => myCFunc( dat : 1 );
|
||||
|
||||
U"A" : U"B";
|
||||
# Top row
|
||||
#'e' : 'f';
|
||||
#'r' : 'p';
|
||||
#'t' : 'g';
|
||||
#'y' : 'j';
|
||||
#'u' : 'l';
|
||||
#'i' : 'u';
|
||||
#'o' : 'y';
|
||||
#'p' : ';';
|
||||
'e' : 'f';
|
||||
'r' : 'p';
|
||||
't' : 'g';
|
||||
'y' : 'j';
|
||||
'u' : 'l';
|
||||
'i' : 'u';
|
||||
'o' : 'y';
|
||||
'p' : ';';
|
||||
|
||||
# Middle Row
|
||||
#'s' : 'r';
|
||||
#'d' : 's';
|
||||
#'f' : 't';
|
||||
#'g' : 'd';
|
||||
#'j' : 'n';
|
||||
#'k' : 'e';
|
||||
#'l' : 'i';
|
||||
#';' : 'o';
|
||||
's' : 'r';
|
||||
'd' : 's';
|
||||
'f' : 't';
|
||||
'g' : 'd';
|
||||
'j' : 'n';
|
||||
'k' : 'e';
|
||||
'l' : 'i';
|
||||
';' : 'o';
|
||||
|
||||
# Bottom Row
|
||||
#'n' : 'k';
|
||||
'n' : 'k';
|
||||
|
||||
|
@ -27,7 +27,7 @@ U"Tab" : U"Capslock";
|
||||
U"I" : U"PrintScreen";
|
||||
U"O" : U"ScrollLock";
|
||||
U"P" : U"Pause";
|
||||
U"{" : U"Up";
|
||||
U"[" : U"Up";
|
||||
|
||||
# Middle Row
|
||||
U"A" : U"VolumeDown";
|
||||
|
@ -12,15 +12,19 @@ myarray[4] = test;
|
||||
# Key Positioning
|
||||
S120 <= x:20, rx:15;
|
||||
S121 <= x:20, y:10, z:2, rx:15, ry:12, rz:39;
|
||||
S[122-125] <= x:20, rx:15;
|
||||
|
||||
# Pixel Positioning
|
||||
P19 <= x:21, rx:16;
|
||||
P[20] <= x:20, rx:15;
|
||||
P[21] <= x:20, y:10, z:2, rx:15, ry:12, rz:39;
|
||||
P[22-25] <= x:20, rx:15;
|
||||
|
||||
# Pixel Channel Mapping
|
||||
P[5](4:8, 5:8, 12:8) : None;
|
||||
P[4](3:8) : S0x31;
|
||||
P[12](40:8, 50:8, 120:8) : S59;
|
||||
P[12](12:8, 13:8, 14:8) : S[40];
|
||||
|
||||
# Animation
|
||||
A[BLEEdsing] <= loop:3,frame:2;
|
||||
@ -34,6 +38,9 @@ A[BLEEdsing, 3] <= P[4](-:32);
|
||||
A[BLEEdsing, 4] <= P[4](+:400);
|
||||
A[BLEEdsing, 5] <= P[4](<<2);
|
||||
A[BLEEdsing, 6] <= P[4](>>1);
|
||||
A[BLEEdsing, 7-9] <= P[4](+32);
|
||||
A[BLEEdsing, 10, 12] <= P[4](+32);
|
||||
A[BLEEdsing, 11, 13-15] <= P[4-10](+32);
|
||||
|
||||
A[BLEEdsing2, 0] <= PL[0](127, 30, 40), P[5](20, 30, 40);
|
||||
A[BLEEdsing2, 1] <= P[1-20,40](40,50,0x60);
|
||||
@ -41,10 +48,20 @@ A[BLEEdsing2, 1] <= P[1-20,40](40,50,0x60);
|
||||
# Animation Triggers
|
||||
myCapability => myFunc( myArg1 : 1, myArg2 : 4 );
|
||||
A[BLEEdsing, 3] : myCapability( 0x8, 0x25 );
|
||||
A[BLEEdsing, 4-6] : myCapability( 0x8, 0x25 );
|
||||
A[BLEEdsing, 7, 9] : myCapability( 0x8, 0x25 );
|
||||
A[BLEEdsing2] : myCapability( 0x8, 0x25 );
|
||||
|
||||
# Animation Results
|
||||
U0x40 : A[BLEEdsing];
|
||||
S[0x37, 0x38] : A"BLEEdsing";
|
||||
S0x39 : A"BLEEdsing", A"BLEEdsing2";
|
||||
S0x40 : A[BLEEdsing];
|
||||
S0x41 : A[BLEEdsing](loop:2);
|
||||
S0x43 : PL[0](0xFF,0xFF,244), P[1-3](20,40,60);
|
||||
S0x42 : A[BLEEdsing](loop:2,div:3);
|
||||
S0x43 : PL[0](0xFF,0xFF,244) + P[1-3](20,40,60);
|
||||
S0x44 : PL[0-2](0xFF,0xFF,244);
|
||||
S0x44 : PL1(0xFF,0xFF,244);
|
||||
S0x45 : PL[2](0x1F,0x2F,0x3F);
|
||||
S0x46 : P[0](11,23,45);
|
||||
S0x47 : P1(11,23,45);
|
||||
|
||||
|
examples/mapping.kll: new file, 57 lines
@ -0,0 +1,57 @@
|
||||
Name = mapping test;
|
||||
Author = "HaaTa (Jacob Alexander) 2016";
|
||||
KLL = 0.5b;
|
||||
|
||||
## Mapping Operators ##
|
||||
|
||||
S0x01 : U"A";
|
||||
S0x01 : U"A";
|
||||
|
||||
# Must include file twice for this case to work
|
||||
S0x02 : U"B";
|
||||
U"B" :: U"C";
|
||||
|
||||
S0x03 :+ U"D";
|
||||
|
||||
S0x04 : U"E";
|
||||
S0x04 :+ U"F";
|
||||
S0x04 :+ U"F";
|
||||
S0x04 :+ U"L";
|
||||
|
||||
S0x06 : U"G";
|
||||
S0x06 :+ U"H";
|
||||
S0x06 :- U"G";
|
||||
|
||||
S0x07 + S0x08 : U"H";
|
||||
S0x09, S0x0A : U"I";
|
||||
|
||||
S[0x0B, 0x0B, 0x0C] : U"J";
|
||||
|
||||
S0x0D :- U"K";
|
||||
|
||||
|
||||
## Isolation Mappings ##
|
||||
|
||||
S0x10 i: U"M";
|
||||
S0x10 i: U"M";
|
||||
|
||||
# Must include file twice for this case to work
|
||||
S0x11 i: U"N";
|
||||
U"N" i:: U"O";
|
||||
|
||||
S0x12 i:+ U"P";
|
||||
|
||||
S0x13 i: U"Q";
|
||||
S0x13 i:+ U"R";
|
||||
|
||||
S0x14 i: U"S";
|
||||
S0x14 i:+ U"T";
|
||||
S0x14 i:- U"S";
|
||||
|
||||
S0x15 + S0x16 i: U"U";
|
||||
S0x17, S0x18 i: U"V";
|
||||
|
||||
S[0x19, 0x1A] i: U"W";
|
||||
|
||||
S0x1B i:- U"X";
|
||||
|
@ -41,6 +41,6 @@ S[ 0x7 - 0x9 ] : U"6";
|
||||
S[ 0x2 - 0x9, 0x10 ] : U"r";
|
||||
S[ 0x2 - 0x9, 0x10 ]+S[0x5 - 0x6, 0x9],S0xA+S0xB : U"r";
|
||||
|
||||
S0x42 : U"}";
|
||||
S0x42 : U"]";
|
||||
S0x42 : U"Esc";
|
||||
|
||||
|
@ -18,17 +18,18 @@ myCapability3 => myFunc3( myArg1 : 2 );
|
||||
myCapability => myFunc( myArg1 : 1, myArg2 : 4 );
|
||||
|
||||
S0x3 : myCapability2();
|
||||
S0x4 : myCapability( 0x8, 0x25 );
|
||||
S[0x4] : myCapability( 0x8, 0x25 );
|
||||
S[ 0x7 - 0x9 ] : U"6";
|
||||
S0x40 : U[0x1-0x4];
|
||||
S0x12 : U[122] + U[123];
|
||||
S0x6 : 'abcdDfF'; # TODO
|
||||
S0x40 : U[0x1];
|
||||
S0x40 : U[0x1-0x4];
|
||||
S0x0B : U["Esc"];
|
||||
S0x0B :+ U["Q"];
|
||||
S[ 0x7 - 0x9 ] : U"6";
|
||||
S[ 0x7 - 0x9 ], S[0x2,0x3] : U"6";
|
||||
S[ 0x2 - 0x9, 0x10 ] :+ U"r";
|
||||
S0x0B :- U["Esc"];
|
||||
S[ 0x3 - 0x4 ] + S[ 0x10 ], S[ 0x20 ] : U"Enter";
|
||||
S127 + S128 : U"0";
|
||||
|
||||
S0x41 : CONS[0x30];
|
||||
@ -39,5 +40,10 @@ S0x45 : SYS[0xA0];
|
||||
S0x46 : SYS["UnDock"];
|
||||
S0x47 : SYS0xA2;
|
||||
|
||||
S0x48 : None;
|
||||
S[0x48] : None;
|
||||
S0x30(P) : U"A";
|
||||
S0x29(P:10ms) : U"A";
|
||||
S0x28(20) : U"A";
|
||||
S0x31(H:20ms, R:1s) : U"B";
|
||||
S0x32(P,H,R) : U"B";
|
||||
|
||||
|
@ -3,9 +3,9 @@ Author = "HaaTa (Jacob Alexander) 2014-2015";
|
||||
KLL = 0.3a;
|
||||
|
||||
usbKeyOut => Output_usbCodeSend_capability( usbCode : 1 );
|
||||
#S0x40 : U0x43;
|
||||
S0x40 : U0x43;
|
||||
S0x40 : U"Backspace";
|
||||
|
||||
S0x42 : U"}";
|
||||
S0x42 : U"]";
|
||||
S0x42 : U"Esc";
|
||||
|
||||
|
examples/state_scheduling.kll: new file, 53 lines
@ -0,0 +1,53 @@
|
||||
Name = "State Scheduling";
|
||||
Author = "HaaTa (Jacob Alexander) 2016";
|
||||
KLL = 0.4;
|
||||
mydefine = "stuffs here";
|
||||
mydefine2 = '"stuffs here"'; # For outputting c define strings
|
||||
mynumber = 414;
|
||||
|
||||
# State Scheduling
|
||||
S0x43 : U"Enter";
|
||||
S[0x43(P,UP,UR)] : U"Enter";
|
||||
S0x44(P) : U"Enter";
|
||||
S0x45(UP) : U"Enter";
|
||||
S0x46(UR) : U"Enter";
|
||||
S0x46(R) : U"Enter";
|
||||
|
||||
S0x47(H) + S0x48 : U"Enter";
|
||||
S0x49(O) + S0x50 : U"Enter";
|
||||
|
||||
# Timing Triggers
|
||||
U"t"(300ms) : 'duuude';
|
||||
U"t"(30.2ms) : 'duuude';
|
||||
U"i"(200) : 'duuude1';
|
||||
U"u"(1s) : 'duuud2e';
|
||||
U"m"(40us) : 'duuu3de';
|
||||
|
||||
U"a" + U"b"(P:1s) : 'slow';
|
||||
U"a" + U"b"(P:50ms,H:100ms,R:200ms) : 'fast';
|
||||
|
||||
# Timing Results
|
||||
U"x" : U"a"(300ms);
|
||||
U"v" : U"a"(P,H:300ms,R);
|
||||
|
||||
# Analog
|
||||
S0x2A(10) : U"B";
|
||||
S0x2A(80) : U"C";
|
||||
S[34-52](22) : 'boo';
|
||||
S[34-52(88)](22) : 'beh';
|
||||
S[34-52(88), 78](30) : 'joe';
|
||||
U"A"(0) : U"A"; # Pulse
|
||||
U"A"(42) : U"Q";
|
||||
U["1"-"5"(42), "Tab"](30) : 'mac';
|
||||
|
||||
|
||||
# Indicators
|
||||
I"NumLock" : U"Space";
|
||||
I"NumLock"(A) : U"Space";
|
||||
I"NumLock"(D) : U"Z";
|
||||
I2 : U"G"; # CapsLock
|
||||
|
||||
U"a" + I"NumLock"(Off) : U"Q";
|
||||
U"a" + I"NumLock"(On) : U"W";
|
||||
|
||||
|
@ -1,7 +1,7 @@
|
||||
# -*- coding: utf-8 -*-
|
||||
|
||||
# Copyright (c) 2008/2013 Andrey Vlasovskikh
|
||||
# Small Python 3 modifications by Jacob Alexander 2014
|
||||
# Modifications by Jacob Alexander 2014, 2016
|
||||
#
|
||||
# Permission is hereby granted, free of charge, to any person obtaining
|
||||
# a copy of this software and associated documentation files (the
|
||||
@ -69,11 +69,30 @@ __all__ = [
|
||||
|
||||
import logging
|
||||
|
||||
log = logging.getLogger('funcparserlib')
|
||||
|
||||
log = logging.getLogger('funcparserlib')
|
||||
debug = False
|
||||
|
||||
|
||||
def Parser_debug(enable, stream=None):
|
||||
'''
|
||||
Enables/Disables debug logger for parser.py
|
||||
|
||||
NOTE: This is not really multi-thread friendly
|
||||
|
||||
@param enable: Enable/disable debug stream
|
||||
@param stream: StringIO stream to use
|
||||
'''
|
||||
global debug
|
||||
debug = enable
|
||||
|
||||
if enable:
|
||||
logging.raiseExceptions = False
|
||||
log.setLevel(logging.DEBUG)
|
||||
ch = logging.StreamHandler(stream)
|
||||
log.addHandler(ch)
|
||||
|
||||
|
||||
class Parser(object):
|
||||
"""A wrapper around a parser function that defines some operators for parser
|
||||
composition.
|
||||
@ -103,7 +122,12 @@ class Parser(object):
|
||||
Runs a parser wrapped into this object.
|
||||
"""
|
||||
if debug:
|
||||
log.debug('trying %s' % self.name)
|
||||
# Truncate at 500 characters
|
||||
# Any longer isn't that useful and makes the output hard to read
|
||||
output = 'trying %s' % self.name
|
||||
if len( output ) > 500:
|
||||
output = output[:250] + ' ... [truncated] ... ' + output[-250:]
|
||||
log.debug(output)
|
||||
return self._run(tokens, s)
|
||||
|
||||
def _run(self, tokens, s):
|
||||
@ -234,7 +258,7 @@ class State(object):
|
||||
self.max = max
|
||||
|
||||
def __str__(self):
|
||||
return unicode((self.pos, self.max))
|
||||
return str((self.pos, self.max))
|
||||
|
||||
def __repr__(self):
|
||||
return 'State(%r, %r)' % (self.pos, self.max)
|
||||
|
kll: new executable file, 151 lines
@ -0,0 +1,151 @@
|
||||
#!/usr/bin/env python3
|
||||
'''
|
||||
KLL Compiler
|
||||
Keyboard Layout Language
|
||||
'''
|
||||
|
||||
# Copyright (C) 2014-2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
# the Free Software Foundation, either version 3 of the License, or
|
||||
# (at your option) any later version.
|
||||
#
|
||||
# This file is distributed in the hope that it will be useful,
|
||||
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||
# GNU General Public License for more details.
|
||||
#
|
||||
# You should have received a copy of the GNU General Public License
|
||||
# along with this file. If not, see <http://www.gnu.org/licenses/>.
|
||||
|
||||
### Imports ###
|
||||
|
||||
import argparse
|
||||
import importlib
|
||||
import os
|
||||
import sys
|
||||
|
||||
import common.stage as stage
|
||||
|
||||
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
WARNING = '\033[5;1;33mWARNING\033[0m:'
|
||||
|
||||
|
||||
## Python Text Formatting Fixer...
|
||||
## Because the creators of Python are averse to proper capitalization.
|
||||
textFormatter_lookup = {
|
||||
"usage: " : "Usage: ",
|
||||
"optional arguments" : "\033[1mOptional Arguments\033[0m",
|
||||
}
|
||||
|
||||
def textFormatter_gettext( s ):
|
||||
return textFormatter_lookup.get( s, s )
|
||||
|
||||
argparse._ = textFormatter_gettext
|
||||
|
||||
|
||||
|
||||
### Misc Utility Functions ###
|
||||
|
||||
def git_revision( kllPath ):
|
||||
import subprocess
|
||||
|
||||
# Change the path to where kll.py is
|
||||
origPath = os.getcwd()
|
||||
os.chdir( kllPath )
|
||||
|
||||
# Just in case git can't be found
|
||||
try:
|
||||
# Get hash of the latest git commit
|
||||
revision = subprocess.check_output( ['git', 'rev-parse', 'HEAD'] ).decode()[:-1]
|
||||
|
||||
# Get list of files that have changed since the commit
|
||||
changed = subprocess.check_output( ['git', 'diff-index', '--name-only', 'HEAD', '--'] ).decode().splitlines()
|
||||
|
||||
# Get commit date
|
||||
date = subprocess.check_output( ['git', 'show', '-s', '--format=%ci'] ).decode()[:-1]
|
||||
except:
|
||||
revision = "<no git>"
|
||||
changed = []
|
||||
date = "<no date>"
|
||||
|
||||
# Change back to the old working directory
|
||||
os.chdir( origPath )
|
||||
|
||||
return revision, changed, date
|
||||
|
||||
|
||||
|
||||
### Argument Parsing ###
|
||||
|
||||
def checkFileExists( filename ):
|
||||
if not os.path.isfile( filename ):
|
||||
print ( "{0} {1} does not exist...".format( ERROR, filename ) )
|
||||
sys.exit( 1 )
|
||||
|
||||
def command_line_args( control ):
|
||||
'''
|
||||
Initialize argparse and process all command line arguments
|
||||
|
||||
@param control: ControlStage object which has access to all the group argument parsers
|
||||
'''
|
||||
# Setup argument processor
|
||||
parser = argparse.ArgumentParser(
|
||||
usage="%(prog)s [options..] [<generic>..]",
|
||||
description="KLL Compiler - Generates specified output from KLL .kll files.",
|
||||
epilog="Example: {0} TODO".format( os.path.basename( sys.argv[0] ) ),
|
||||
formatter_class=argparse.RawTextHelpFormatter,
|
||||
add_help=False,
|
||||
)
|
||||
|
||||
# Get git information
|
||||
control.git_rev, control.git_changes, control.git_date = git_revision(
|
||||
os.path.dirname( os.path.realpath( __file__ ) )
|
||||
)
|
||||
control.version = "ALPHA 0.5c.{0} - {1}".format( control.git_rev, control.git_date )
|
||||
|
||||
# Optional Arguments
|
||||
parser.add_argument(
|
||||
'-h', '--help',
|
||||
action="help",
|
||||
help="This message."
|
||||
)
|
||||
parser.add_argument(
|
||||
'-v', '--version',
|
||||
action="version",
|
||||
version="%(prog)s {0}".format( control.version ),
|
||||
help="Show program's version number and exit"
|
||||
)
|
||||
|
||||
# Add stage arguments
|
||||
control.command_line_flags( parser )
|
||||
|
||||
# Process Arguments
|
||||
args = parser.parse_args()
|
||||
|
||||
# Utilize parsed arguments in each of the stages
|
||||
control.command_line_args( args )
|
||||
|
||||
|
||||
|
||||
### Main Entry Point ###
|
||||
|
||||
if __name__ == '__main__':
|
||||
# Initialize Control Stages
|
||||
control = stage.ControlStage()
|
||||
|
||||
# Process Command-Line Args
|
||||
command_line_args( control )
|
||||
|
||||
# Process Control Stages
|
||||
control.process()
|
||||
|
||||
# Successful Execution
|
||||
sys.exit( 0 )
|
||||
|
kll.py: modified, 74 lines
@ -41,12 +41,12 @@ from funcparserlib.parser import (some, a, many, oneplus, skip, finished, maybe,
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
|
||||
|
||||
## Python Text Formatting Fixer...
|
||||
## Because the creators of Python are averse to proper capitalization.
|
||||
## Python Text Formatting Fixer...
|
||||
## Because the creators of Python are averse to proper capitalization.
|
||||
textFormatter_lookup = {
|
||||
"usage: " : "Usage: ",
|
||||
"optional arguments" : "Optional Arguments",
|
||||
@ -181,13 +181,13 @@ def tokenize( string ):
|
||||
|
||||
### Parsing ###
|
||||
|
||||
## Map Arrays
|
||||
## Map Arrays
|
||||
macros_map = Macros()
|
||||
variables_dict = Variables()
|
||||
capabilities_dict = Capabilities()
|
||||
|
||||
|
||||
## Parsing Functions
|
||||
## Parsing Functions
|
||||
|
||||
class Make:
|
||||
def scanCode( token ):
|
||||
@ -333,7 +333,7 @@ class Make:
|
||||
def indCode_number( token ):
|
||||
return Make.hidCode_number( 'IndCode', token )
|
||||
|
||||
# Replace key-word with None specifier (which indicates a noneOut capability)
|
||||
# Replace key-word with None specifier (which indicates a noneOut capability)
|
||||
def none( token ):
|
||||
return [[[('NONE', 0)]]]
|
||||
|
||||
@ -436,7 +436,7 @@ class Make:
|
||||
print( value )
|
||||
return [ value[0] ]
|
||||
|
||||
# Range can go from high to low or low to high
|
||||
# Range can go from high to low or low to high
|
||||
def scanCode_range( rangeVals ):
|
||||
start = rangeVals[0]
|
||||
end = rangeVals[1]
|
||||
@ -448,9 +448,9 @@ class Make:
|
||||
# Iterate from start to end, and generate the range
|
||||
return list( range( start, end + 1 ) )
|
||||
|
||||
# Range can go from high to low or low to high
|
||||
# Warn on 0-9 for USBCodes (as this does not do what one would expect) TODO
|
||||
# Lookup USB HID tags and convert to a number
|
||||
# Range can go from high to low or low to high
|
||||
# Warn on 0-9 for USBCodes (as this does not do what one would expect) TODO
|
||||
# Lookup USB HID tags and convert to a number
|
||||
def hidCode_range( type, rangeVals ):
|
||||
# Check if already integers
|
||||
if isinstance( rangeVals[0], int ):
|
||||
@ -493,7 +493,7 @@ class Make:
|
||||
return ""
|
||||
|
||||
|
||||
## Base Rules
|
||||
## Base Rules
|
||||
|
||||
const = lambda x: lambda _: x
|
||||
unarg = lambda f: lambda x: f(*x)
|
||||
@ -512,7 +512,7 @@ def listElem( item ):
|
||||
def listToTuple( items ):
|
||||
return tuple( items )
|
||||
|
||||
# Flatten only the top layer (list of lists of ...)
|
||||
# Flatten only the top layer (list of lists of ...)
|
||||
def oneLayerFlatten( items ):
|
||||
mainList = []
|
||||
for sublist in items:
|
||||
@ -521,7 +521,7 @@ def oneLayerFlatten( items ):
|
||||
|
||||
return mainList
|
||||
|
||||
# Capability arguments may need to be expanded (e.g. 1 16 bit argument needs to be 2 8 bit arguments for the state machine)
|
||||
# Capability arguments may need to be expanded (e.g. 1 16 bit argument needs to be 2 8 bit arguments for the state machine)
|
||||
def capArgExpander( items ):
|
||||
newArgs = []
|
||||
# For each defined argument in the capability definition
|
||||
@ -536,8 +536,8 @@ def capArgExpander( items ):
|
||||
|
||||
return tuple( [ items[0], tuple( newArgs ) ] )
|
||||
|
||||
# Expand ranges of values in the 3rd dimension of the list, to a list of 2nd lists
|
||||
# i.e. [ sequence, [ combo, [ range ] ] ] --> [ [ sequence, [ combo ] ], <option 2>, <option 3> ]
|
||||
# Expand ranges of values in the 3rd dimension of the list, to a list of 2nd lists
|
||||
# i.e. [ sequence, [ combo, [ range ] ] ] --> [ [ sequence, [ combo ] ], <option 2>, <option 3> ]
|
||||
def optionExpansion( sequences ):
|
||||
expandedSequences = []
|
||||
|
||||
@ -612,7 +612,7 @@ def tupleit( t ):
|
||||
return tuple( map( tupleit, t ) ) if isinstance( t, ( tuple, list ) ) else t
|
||||
|
||||
|
||||
## Evaluation Rules
|
||||
## Evaluation Rules
|
||||
|
||||
class Eval:
|
||||
def scanCode( triggers, operator, results ):
|
||||
@ -746,7 +746,7 @@ class Set:
|
||||
variable = unarg( Eval.variable )
|
||||
|
||||
|
||||
## Sub Rules
|
||||
## Sub Rules
|
||||
|
||||
usbCode = tokenType('USBCode') >> Make.usbCode
|
||||
scanCode = tokenType('ScanCode') >> Make.scanCode
|
||||
@ -771,16 +771,16 @@ seqString = tokenType('SequenceString') >> Make.seqString
|
||||
unseqString = tokenType('SequenceString') >> Make.unseqString # For use with variables
|
||||
pixelOperator = tokenType('PixelOperator')
|
||||
|
||||
# Code variants
|
||||
# Code variants
|
||||
code_begin = tokenType('CodeBegin')
|
||||
code_end = tokenType('CodeEnd')
|
||||
|
||||
# Specifier
|
||||
# Specifier
|
||||
specifier_state = ( name + skip( operator(':') ) + timing ) | ( name + skip( operator(':') ) + timing ) | timing | name >> Make.specifierState
|
||||
specifier_analog = number >> Make.specifierAnalog
|
||||
specifier_list = skip( parenthesis('(') ) + many( ( specifier_state | specifier_analog ) + skip( maybe( comma ) ) ) + skip( parenthesis(')') )
|
||||
|
||||
# Scan Codes
|
||||
# Scan Codes
|
||||
scanCode_start = tokenType('ScanCodeStart')
|
||||
scanCode_range = number + skip( dash ) + number >> Make.scanCode_range
|
||||
scanCode_listElem = number >> listElem
|
||||
@ -791,7 +791,7 @@ scanCode_elem = scanCode + maybe( specifier_list ) >> Make.specifierUnroll
|
||||
scanCode_combo = oneplus( ( scanCode_expanded | scanCode_elem ) + skip( maybe( plus ) ) )
|
||||
scanCode_sequence = oneplus( scanCode_combo + skip( maybe( comma ) ) )
|
||||
|
||||
# Cons Codes
|
||||
# Cons Codes
|
||||
consCode_start = tokenType('ConsCodeStart')
|
||||
consCode_number = number >> Make.consCode_number
|
||||
consCode_range = ( consCode_number | unString ) + skip( dash ) + ( number | unString ) >> Make.consCode_range
|
||||
@ -802,7 +802,7 @@ consCode_innerList = oneplus( consCode_specifier + skip( maybe( comma ) ) ) >>
|
||||
consCode_expanded = skip( consCode_start ) + consCode_innerList + skip( code_end )
|
||||
consCode_elem = consCode + maybe( specifier_list ) >> Make.specifierUnroll >> listElem
|
||||
|
||||
# Sys Codes
|
||||
# Sys Codes
|
||||
sysCode_start = tokenType('SysCodeStart')
|
||||
sysCode_number = number >> Make.sysCode_number
|
||||
sysCode_range = ( sysCode_number | unString ) + skip( dash ) + ( number | unString ) >> Make.sysCode_range
|
||||
@ -813,7 +813,7 @@ sysCode_innerList = oneplus( sysCode_specifier + skip( maybe( comma ) ) ) >> f
|
||||
sysCode_expanded = skip( sysCode_start ) + sysCode_innerList + skip( code_end )
|
||||
sysCode_elem = sysCode + maybe( specifier_list ) >> Make.specifierUnroll >> listElem
|
||||
|
||||
# Indicator Codes
|
||||
# Indicator Codes
|
||||
indCode_start = tokenType('IndicatorStart')
|
||||
indCode_number = number >> Make.indCode_number
|
||||
indCode_range = ( indCode_number | unString ) + skip( dash ) + ( number | unString ) >> Make.indCode_range
|
||||
@ -824,7 +824,7 @@ indCode_innerList = oneplus( indCode_specifier + skip( maybe( comma ) ) ) >> f
|
||||
indCode_expanded = skip( indCode_start ) + indCode_innerList + skip( code_end )
|
||||
indCode_elem = indCode + maybe( specifier_list ) >> Make.specifierUnroll >> listElem
|
||||
|
||||
# USB Codes
|
||||
# USB Codes
|
||||
usbCode_start = tokenType('USBCodeStart')
|
||||
usbCode_number = number >> Make.usbCode_number
|
||||
usbCode_range = ( usbCode_number | unString ) + skip( dash ) + ( number | unString ) >> Make.usbCode_range
|
||||
@ -835,13 +835,13 @@ usbCode_innerList = oneplus( usbCode_specifier + skip( maybe( comma ) ) ) >> f
|
||||
usbCode_expanded = skip( usbCode_start ) + usbCode_innerList + skip( code_end )
|
||||
usbCode_elem = usbCode + maybe( specifier_list ) >> Make.specifierUnroll >> listElem
|
||||
|
||||
# HID Codes
|
||||
# HID Codes
|
||||
hidCode_elem = usbCode_expanded | usbCode_elem | sysCode_expanded | sysCode_elem | consCode_expanded | consCode_elem | indCode_expanded | indCode_elem
|
||||
|
||||
usbCode_combo = oneplus( hidCode_elem + skip( maybe( plus ) ) ) >> listElem
|
||||
usbCode_sequence = oneplus( ( usbCode_combo | seqString ) + skip( maybe( comma ) ) ) >> oneLayerFlatten
|
||||
|
||||
# Pixels
|
||||
# Pixels
|
||||
pixel_start = tokenType('PixelStart')
|
||||
pixel_number = number
|
||||
pixel_range = ( pixel_number ) + skip( dash ) + ( number ) >> Make.range
|
||||
@ -850,25 +850,25 @@ pixel_innerList = many( ( pixel_range | pixel_listElem ) + skip( maybe( comma
|
||||
pixel_expanded = skip( pixel_start ) + pixel_innerList + skip( code_end )
|
||||
pixel_elem = pixel >> listElem
|
||||
|
||||
# Pixel Layer
|
||||
# Pixel Layer
|
||||
pixellayer_start = tokenType('PixelLayerStart')
|
||||
pixellayer_number = number
|
||||
pixellayer_expanded = skip( pixellayer_start ) + pixellayer_number + skip( code_end )
|
||||
pixellayer_elem = pixelLayer >> listElem
|
||||
|
||||
# Pixel Channels
|
||||
# Pixel Channels
|
||||
pixelchan_chans = many( number + skip( operator(':') ) + number + skip( maybe( comma ) ) ) >> Make.pixelchans
|
||||
pixelchan_elem = ( pixel_expanded | pixel_elem ) + skip( parenthesis('(') ) + pixelchan_chans + skip( parenthesis(')') ) >> Make.pixelchan_elem
|
||||
|
||||
# Pixel Mods
|
||||
# Pixel Mods
|
||||
pixelmod_mods = many( maybe( pixelOperator | plus | dash ) + number + skip( maybe( comma ) ) ) >> Make.pixelmods
|
||||
pixelmod_layer = ( pixellayer_expanded | pixellayer_elem ) >> Make.pixellayer
|
||||
pixelmod_elem = ( pixel_expanded | pixel_elem | pixelmod_layer ) + skip( parenthesis('(') ) + pixelmod_mods + skip( parenthesis(')') )
|
||||
|
||||
# Pixel Capability
|
||||
# Pixel Capability
|
||||
pixel_capability = pixelmod_elem >> Make.pixelCapability
|
||||
|
||||
# Animations
|
||||
# Animations
|
||||
animation_start = tokenType('AnimationStart')
|
||||
animation_name = name
|
||||
animation_frame_range = ( number ) + skip( dash ) + ( number ) >> Make.range
|
||||
@ -877,28 +877,28 @@ animation_def = skip( animation_start ) + animation_name + skip( code_en
|
||||
animation_expanded = skip( animation_start ) + animation_name + skip( comma ) + animation_name_frame + skip( code_end )
|
||||
animation_elem = animation >> listElem
|
||||
|
||||
# Animation Modifier
|
||||
# Animation Modifier
|
||||
animation_modifier = many( ( name | number ) + maybe( skip( operator(':') ) + number ) + skip( maybe( comma ) ) )
|
||||
|
||||
# Animation Capability
|
||||
# Animation Capability
|
||||
animation_capability = ( animation_def | animation_elem ) + maybe( skip( parenthesis('(') + animation_modifier + skip( parenthesis(')') ) ) ) >> Make.animationCapability
|
||||
|
||||
# Capabilities
|
||||
# Capabilities
|
||||
capFunc_arguments = many( number + skip( maybe( comma ) ) ) >> listToTuple
|
||||
capFunc_elem = name + skip( parenthesis('(') ) + capFunc_arguments + skip( parenthesis(')') ) >> capArgExpander >> listElem
|
||||
capFunc_combo = oneplus( ( hidCode_elem | capFunc_elem | animation_capability | pixel_capability ) + skip( maybe( plus ) ) ) >> listElem
|
||||
capFunc_sequence = oneplus( ( capFunc_combo | seqString ) + skip( maybe( comma ) ) ) >> oneLayerFlatten
|
||||
|
||||
# Trigger / Result Codes
|
||||
# Trigger / Result Codes
|
||||
triggerCode_outerList = scanCode_sequence >> optionExpansion
|
||||
triggerUSBCode_outerList = usbCode_sequence >> optionExpansion >> hidCodeToCapability
|
||||
resultCode_outerList = ( ( capFunc_sequence >> optionExpansion ) | none ) >> hidCodeToCapability
|
||||
|
||||
# Positions
|
||||
# Positions
|
||||
position_list = oneplus( position + skip( maybe( comma ) ) )
|
||||
|
||||
|
||||
## Main Rules
|
||||
## Main Rules
|
||||
|
||||
#| Assignment
|
||||
#| <variable> = <variable contents>;
|
||||
|
@ -1,7 +1,7 @@
|
||||
#!/usr/bin/env python3
|
||||
# KLL Compiler Containers
|
||||
#
|
||||
# Copyright (C) 2014-2015 by Jacob Alexander
|
||||
# Copyright (C) 2014-2016 by Jacob Alexander
|
||||
#
|
||||
# This file is free software: you can redistribute it and/or modify
|
||||
# it under the terms of the GNU General Public License as published by
|
||||
@ -24,7 +24,7 @@ import copy
|
||||
|
||||
### Decorators ###
|
||||
|
||||
## Print Decorator Variables
|
||||
## Print Decorator Variables
|
||||
ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
|
||||
|
||||
@ -32,7 +32,7 @@ ERROR = '\033[5;1;31mERROR\033[0m:'
|
||||
### Parsing ###
|
||||
|
||||
|
||||
## Containers
|
||||
## Containers
|
||||
|
||||
class ScanCode:
|
||||
# Container for ScanCodes
|
||||
|
@ -1,4 +1,4 @@
|
||||
/* Copyright (C) 2014-2015 by Jacob Alexander
|
||||
/* Copyright (C) 2014-2016 by Jacob Alexander
|
||||
*
|
||||
* This file is free software: you can redistribute it and/or modify
|
||||
* it under the terms of the GNU General Public License as published by
|
||||
|
@ -1,4 +1,4 @@
|
||||
/* Copyright (C) 2014-2015 by Jacob Alexander
|
||||
/* Copyright (C) 2014-2016 by Jacob Alexander
|
||||
*
|
||||
* This file is free software: you can redistribute it and/or modify
|
||||
* it under the terms of the GNU General Public License as published by
|
||||
|