SINE EXAMPLE ERROR #9

Open

JCMiles opened this issue Jul 23, 2019 · 0 comments
JCMiles commented Jul 23, 2019

Hi,
I'm trying to run the sine_experiment.py example, but I get stuck with this error:

Generating sine data into sine.csv
Generated 3000 rows of output data into sine.csv
Generating experiment files in directory: C:\Users\shock\Desktop\workspace\Allie...
Writing 314 lines...
Writing 114 lines...
done.
None
Successfully submitted new HyperSearch job, jobID=1005
Job 1005 failed with error: Fatal Python error: initfsencoding: unable to load the file system codec
File "C:\Users\shock\Anaconda3\envs\numenta\lib\encodings_init_.py", line 123
raise CodecRegistryError,
^
SyntaxError: invalid syntax

Current thread 0x00008a94 (most recent call first):

Evaluated 0 models
HyperSearch finished!
Job 1005 failed with error: Fatal Python error: initfsencoding: unable to load the file system codec
File "C:\Users\shock\Anaconda3\envs\numenta\lib\encodings_init_.py", line 123
raise CodecRegistryError,
^
SyntaxError: invalid syntax

Current thread 0x00008a94 (most recent call first):

Worker completion message: None

Results from all experiments:

2 2 2
Generating experiment files in directory: c:\users\shock\appdata\local\temp\tmprrlshn...
Writing 314 lines...
Writing 114 lines...
done.
None
_jobInfoNamedTuple(jobId=1005, client=u'GRP', clientInfo=u'', clientKey=u'', cmdLine=u'$HYPERSEARCH', params=u'{"hsVersion": "v2", "maxModels": null, "persistentJobGUID": "fe5848f0-ad6d-11e9-b7a7-3085a998ef7c", "useTerminators": false, "description": {"inferenceType": "TemporalAnomaly", "includedFields": [{"minValue": -1.0, "fieldName": "sine", "fieldType": "float", "maxValue": 1.0}], "inferenceArgs": {"predictionSteps": [1], "predictedField": "sine"}, "streamDef": {"info": "sine", "version": 1, "streams": [{"info": "sine.csv", "source": "file://sine.csv", "columns": ["*"]}]}, "swarmSize": "medium"}}', jobHash='\xfeXp\x00\xadm\x11\xe9\x89u0\x85\xa9\x98\xef|', status=u'notStarted', completionReason=None, completionMsg=None, workerCompletionReason=u'success', workerCompletionMsg=None, cancel=0, startTime=None, endTime=None, results=None, engJobType=u'hypersearch', minimumWorkers=1, maximumWorkers=8, priority=0, engAllocateNewWorkers=1, engUntendedDeadWorkers=0, numFailedWorkers=0, lastFailedWorkerErrorMsg=None, engCleaningStatus=u'notdone', genBaseDescription=None, genPermutations=None, engLastUpdateTime=datetime.datetime(2019, 7, 23, 17, 19, 14), engCjmConnId=None, engWorkerState=None, engStatus=None, engModelMilestones=None)
json.loads(jobInfo.results) raised an exception. Here is some info to help with debugging:
jobInfo: _jobInfoNamedTuple(jobId=1005, client=u'GRP', clientInfo=u'', clientKey=u'', cmdLine=u'$HYPERSEARCH', params=u'{"hsVersion": "v2", "maxModels": null, "persistentJobGUID": "fe5848f0-ad6d-11e9-b7a7-3085a998ef7c", "useTerminators": false, "description": {"inferenceType": "TemporalAnomaly", "includedFields": [{"minValue": -1.0, "fieldName": "sine", "fieldType": "float", "maxValue": 1.0}], "inferenceArgs": {"predictionSteps": [1], "predictedField": "sine"}, "streamDef": {"info": "sine", "version": 1, "streams": [{"info": "sine.csv", "source": "file://sine.csv", "columns": ["*"]}]}, "swarmSize": "medium"}}', jobHash='\xfeXp\x00\xadm\x11\xe9\x89u0\x85\xa9\x98\xef|', status=u'notStarted', completionReason=None, completionMsg=None, workerCompletionReason=u'success', workerCompletionMsg=None, cancel=0, startTime=None, endTime=None, results=None, engJobType=u'hypersearch', minimumWorkers=1, maximumWorkers=8, priority=0, engAllocateNewWorkers=1, engUntendedDeadWorkers=0, numFailedWorkers=0, lastFailedWorkerErrorMsg=None, engCleaningStatus=u'notdone', genBaseDescription=None, genPermutations=None, engLastUpdateTime=datetime.datetime(2019, 7, 23, 17, 19, 14), engCjmConnId=None, engWorkerState=None, engStatus=None, engModelMilestones=None)
jobInfo.results: None
EXCEPTION: expected string or buffer
Traceback (most recent call last):
File "C:\Users\shock\Desktop\workspace\NumentaTest\sine_experiment.py", line 85, in
run_sine_experiment()
File "C:\Users\shock\Desktop\workspace\NumentaTest\sine_experiment.py", line 57, in run_sine_experiment
model_params = swarm_over_data()
File "C:\Users\shock\Desktop\workspace\NumentaTest\sine_experiment.py", line 50, in swarm_over_data
SWARM_CONFIG, {'maxWorkers': 8, 'overwrite': True})
File "C:\Users\shock\Anaconda3\envs\numenta\lib\site-packages\nupic\swarming\permutations_runner.py", line 271, in runWithConfig
return _runAction(runOptions)
File "C:\Users\shock\Anaconda3\envs\numenta\lib\site-packages\nupic\swarming\permutations_runner.py", line 212, in _runAction
returnValue = _runHyperSearch(runOptions)
File "C:\Users\shock\Anaconda3\envs\numenta\lib\site-packages\nupic\swarming\permutations_runner.py", line 155, in runHyperSearch
metricsKeys=search.getDiscoveredMetricsKeys())
File "C:\Users\shock\Anaconda3\envs\numenta\lib\site-packages\nupic\swarming\permutations_runner.py", line 825, in generateReport
results = json.loads(jobInfo.results)
File "C:\Users\shock\Anaconda3\envs\numenta\lib\json_init
.py", line 339, in loads
return _default_decoder.decode(s)
File "C:\Users\shock\Anaconda3\envs\numenta\lib\json\decoder.py", line 364, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
TypeError: expected string or buffer


It seems like jobInfo.results is None, and that crashes when:
results = json.loads(jobInfo.results)
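
For what it's worth, that TypeError is easy to reproduce in isolation: on Python 2, json.loads() raises exactly this message when it is handed None instead of a string. A minimal sketch (plain Python 2, no NuPIC involved; job_results is just a stand-in for jobInfo.results):

import json

job_results = None  # stand-in for jobInfo.results when the job produced nothing

try:
    json.loads(job_results)
except TypeError as e:
    print("json.loads failed: %s" % e)  # prints: expected string or buffer

# A defensive pattern would skip decoding when there are no results:
results = json.loads(job_results) if job_results is not None else None

So the json.loads call looks like only the symptom; the job evaluated 0 models, so there were never any results to decode.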

I also tried with a personal dataset and got the same error.
Any idea what the cause is and how to fix it?
