| instance_id | file_changes | repo | base_commit | problem_statement | patch |
|---|---|---|---|---|---|
0b01001001__spectree-64 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": null
},
"file": "setup.py"
},
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"spectree/utils.py:parse_par... | 0b01001001/spectree | a091fab020ac26548250c907bae0855273a98778 | [BUG]description for query paramters can not show in swagger ui
Hi, when I add a description for a schema used in query, it can not show in swagger ui but can show in Redoc
```py
@HELLO.route('/', methods=['GET'])
@api.validate(query=HelloForm)
def hello():
"""
    hello comment
:return:
"""
return '... | diff --git a/setup.py b/setup.py
index 1b3cb64..4ef21e6 100644
--- a/setup.py
+++ b/setup.py
@@ -14,7 +14,7 @@ with open(path.join(here, 'requirements.txt'), encoding='utf-8') as f:
setup(
name='spectree',
- version='0.3.7',
+ version='0.3.8',
author='Keming Yang',
author_email='kemingy94@gmail.... |
12rambau__sepal_ui-411 | [
{
"changes": {
"added_entities": [
"sepal_ui/sepalwidgets/inputs.py:DatePicker.disable"
],
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"sepal_ui/sepalwidgets/inputs.py:DatePicker"
]
},
"file": "sepal_ui/sepalwidgets/inputs.py"
... | 12rambau/sepal_ui | 179bd8d089275c54e94a7614be7ed03d298ef532 | add a disabled trait on the datepicker
I'm currently coding it in a module and the process of disabling a datepicker is utterly boring. I think we could add an extra trait to the layout and control enabling and disabling directly from the built-in widget
```python
self.w_start = sw.DatePicker(label="start", v_mo... | diff --git a/docs/source/modules/sepal_ui.sepalwidgets.DatePicker.rst b/docs/source/modules/sepal_ui.sepalwidgets.DatePicker.rst
index 1a982afb..867227cb 100644
--- a/docs/source/modules/sepal_ui.sepalwidgets.DatePicker.rst
+++ b/docs/source/modules/sepal_ui.sepalwidgets.DatePicker.rst
@@ -8,6 +8,7 @@ sepal\_ui.sepalwi... |
12rambau__sepal_ui-416 | [
{
"changes": {
"added_entities": [
"sepal_ui/sepalwidgets/app.py:DrawerItem.add_notif",
"sepal_ui/sepalwidgets/app.py:DrawerItem.remove_notif"
],
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/app.py:DrawerItem.__init__",
"sepal_ui/sepa... | 12rambau/sepal_ui | 8b76805db051d6d15024bd9ec2d78502cd92132e | Interact with navigation drawers
Sometimes it is useful to pass data from the module model to the app environment, and so far we do not have this implementation.
We can add two simple methods to the drawers so they can update their state with icons, badges, and so on.
| diff --git a/docs/source/modules/sepal_ui.sepalwidgets.DrawerItem.rst b/docs/source/modules/sepal_ui.sepalwidgets.DrawerItem.rst
index a3280cd3..22b87b44 100644
--- a/docs/source/modules/sepal_ui.sepalwidgets.DrawerItem.rst
+++ b/docs/source/modules/sepal_ui.sepalwidgets.DrawerItem.rst
@@ -7,7 +7,9 @@ sepal\_ui.sepalwi... |
12rambau__sepal_ui-418 | [
{
"changes": {
"added_entities": [
"sepal_ui/sepalwidgets/inputs.py:DatePicker.check_date",
"sepal_ui/sepalwidgets/inputs.py:DatePicker.is_valid_date"
],
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/inputs.py:DatePicker.__init__"
],
... | 12rambau/sepal_ui | 8b76805db051d6d15024bd9ec2d78502cd92132e | Can't instantiate a sw.DatePicker with initial v_model
It is not possible to instantiate the sepal DatePicker with an initial date given through the `v_model` parameter | diff --git a/docs/source/modules/sepal_ui.sepalwidgets.DatePicker.rst b/docs/source/modules/sepal_ui.sepalwidgets.DatePicker.rst
index 867227cb..322cca23 100644
--- a/docs/source/modules/sepal_ui.sepalwidgets.DatePicker.rst
+++ b/docs/source/modules/sepal_ui.sepalwidgets.DatePicker.rst
@@ -9,6 +9,7 @@ sepal\_ui.sepalwi... |
12rambau__sepal_ui-459 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/app.py:NavDrawer.__init__"
],
"edited_modules": [
"sepal_ui/sepalwidgets/app.py:NavDrawer"
]
},
"file": "sepal_ui/sepalwidgets/app.py"
},
{
... | 12rambau/sepal_ui | a4b3091755a11ef31a3714858007a93b750b6a79 | crowdin untranslated keys are marked as empty string
These strings are interpreted as "something" by the translator, leading to empty strings everywhere in the built-in components.
They should be ignored | diff --git a/docs/source/modules/sepal_ui.translator.Translator.rst b/docs/source/modules/sepal_ui.translator.Translator.rst
index 60fa976c..642a3ab6 100644
--- a/docs/source/modules/sepal_ui.translator.Translator.rst
+++ b/docs/source/modules/sepal_ui.translator.Translator.rst
@@ -27,6 +27,7 @@ sepal\_ui.translator.Tr... |
12rambau__sepal_ui-501 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/app.py:LocaleSelect.__init__"
],
"edited_modules": [
"sepal_ui/sepalwidgets/app.py:LocaleSelect"
]
},
"file": "sepal_ui/sepalwidgets/app.py"
},
... | 12rambau/sepal_ui | 7eb3f48735e1cfeac75fecf88dd8194c8daea3d3 | use box for the translator?
I discovered this lib while working on the geemap drop.
I think it could be super handy for the translator keys and maybe faster. https://github.com/cdgriffith/Box
side note: we will need it anyway for the geemap drop | diff --git a/docs/source/modules/sepal_ui.translator.Translator.rst b/docs/source/modules/sepal_ui.translator.Translator.rst
index 642a3ab6..7f11e39f 100644
--- a/docs/source/modules/sepal_ui.translator.Translator.rst
+++ b/docs/source/modules/sepal_ui.translator.Translator.rst
@@ -2,19 +2,6 @@ sepal\_ui.translator.Tra... |
12rambau__sepal_ui-516 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"sepal_ui/aoi/aoi_model.py:AoiModel"
]
},
"file": "sepal_ui/aoi/aoi_model.py"
},
{
"changes": {
"added_entities": null,
"added_modules": nu... | 12rambau/sepal_ui | 9c319b0c21b8b1ba75173f3f85fd184747c398de | deprecate zip_dir
https://github.com/12rambau/sepal_ui/blob/a9255e7c566aac31ee7f8303e74fb7e8a3d57e5f/sepal_ui/aoi/aoi_model.py#L64
This folder is created on AOI call but is not used anymore as we are using the tmp module to create the tmp directory. | diff --git a/docs/source/modules/sepal_ui.aoi.AoiModel.rst b/docs/source/modules/sepal_ui.aoi.AoiModel.rst
index 0f5b8f1a..ccdcab52 100644
--- a/docs/source/modules/sepal_ui.aoi.AoiModel.rst
+++ b/docs/source/modules/sepal_ui.aoi.AoiModel.rst
@@ -12,7 +12,6 @@ sepal\_ui.aoi.AoiModel
~AoiModel.NAME
~AoiM... |
12rambau__sepal_ui-518 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/mapping/aoi_control.py:AoiControl.__init__"
],
"edited_modules": [
"sepal_ui/mapping/aoi_control.py:AoiControl"
]
},
"file": "sepal_ui/mapping/aoi_control.py"
... | 12rambau/sepal_ui | 698d446e33062934d49f9edb91cbe303b73e786f | add possibility to add text in the map_btn
The current implementation of the map_btn only authorizes logos. It would be nice to offer the option of using letters, as in the SEPAL main framework (3 capital letters only) | diff --git a/sepal_ui/mapping/aoi_control.py b/sepal_ui/mapping/aoi_control.py
index 01a6aa48..ae143d2c 100644
--- a/sepal_ui/mapping/aoi_control.py
+++ b/sepal_ui/mapping/aoi_control.py
@@ -36,7 +36,7 @@ class AoiControl(WidgetControl):
kwargs["position"] = kwargs.pop("position", "topright")
# crea... |
12rambau__sepal_ui-535 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/mapping/sepal_map.py:SepalMap.zoom_bounds"
],
"edited_modules": [
"sepal_ui/mapping/sepal_map.py:SepalMap"
]
},
"file": "sepal_ui/mapping/sepal_map.py"
},
... | 12rambau/sepal_ui | 6a619361e90ab318463e2094fc9dbcbc85dd2e8f | create a translator function to check the use of the keys
If you update the same application many times you may end up removing some or all of the existing keys. It is complex to visually assess whether all the remaining keys in the dict are used.
Maybe a parser could be interesting to check all the folder files and vali... | diff --git a/sepal_ui/mapping/sepal_map.py b/sepal_ui/mapping/sepal_map.py
index 57693e56..e2860daf 100644
--- a/sepal_ui/mapping/sepal_map.py
+++ b/sepal_ui/mapping/sepal_map.py
@@ -227,8 +227,8 @@ class SepalMap(ipl.Map):
# Center map to the centroid of the layer(s)
self.center = [(maxy - miny) / 2 ... |
12rambau__sepal_ui-574 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/translator/translator.py:Translator.__init__",
"sepal_ui/translator/translator.py:Translator.search_key"
],
"edited_modules": [
"sepal_ui/translator/translator.py:Tr... | 12rambau/sepal_ui | 412e02ef08df68c256f384081d2c7eaecc09428e | _protected_keys are not raising error when used in translator
`protected_keys` are not raising errors when used in a JSON translation file. The same happens with the "`FORBIDDEN_KEYS`" when they are used at nested levels.
To reproduce...
```Python
# set up the appropriate keys for each language
keys = {
"en... | diff --git a/sepal_ui/translator/translator.py b/sepal_ui/translator/translator.py
index 1ad14c98..ea647223 100644
--- a/sepal_ui/translator/translator.py
+++ b/sepal_ui/translator/translator.py
@@ -65,7 +65,7 @@ class Translator(Box):
# check if forbidden keys are being used
# this will raise an er... |
12rambau__sepal_ui-601 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/inputs.py:DatePicker.__init__",
"sepal_ui/sepalwidgets/inputs.py:DatePicker.check_date"
],
"edited_modules": [
"sepal_ui/sepalwidgets/inputs.py:DatePick... | 12rambau/sepal_ui | 89f8d87dc4f83bfc2e96a111692ae252e470e8bc | Datepicker is not fully customizable
As our main `DatePicker` usage is in its "menu" form, some use cases are not handy to set up:
- set a min_, max_ value directly (you have to `datepicker.children.....min_`...)
- set a default initial value with `v_model` since it is hardcoded from the beginning
- the `jslink` "... | diff --git a/sepal_ui/sepalwidgets/inputs.py b/sepal_ui/sepalwidgets/inputs.py
index 95fda88a..6293f828 100644
--- a/sepal_ui/sepalwidgets/inputs.py
+++ b/sepal_ui/sepalwidgets/inputs.py
@@ -6,6 +6,7 @@ import ee
import geopandas as gpd
import ipyvuetify as v
import pandas as pd
+from deprecated.sphinx import versio... |
12rambau__sepal_ui-608 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/reclassify/reclassify_view.py:ImportMatrixDialog.__init__",
"sepal_ui/reclassify/reclassify_view.py:SaveMatrixDialog.__init__",
"sepal_ui/reclassify/reclassify_view.py:Reclassif... | 12rambau/sepal_ui | 2d5126f5e9521470cbeb5ad374f74046e889f771 | create a function to set the text of the btn dynamically
icon and text should be editable dynamically
https://github.com/12rambau/sepal_ui/blob/8af255ec0d1cb3ad4dd74d021ad140fafef756f6/sepal_ui/sepalwidgets/btn.py#L38 | diff --git a/docs/source/widgets/btn.rst b/docs/source/widgets/btn.rst
index 949d5468..91d92967 100644
--- a/docs/source/widgets/btn.rst
+++ b/docs/source/widgets/btn.rst
@@ -20,8 +20,8 @@ The default color is set to "primary".
v.theme.dark = False
btn = sw.Btn(
- text = "The One btn",
- i... |
12rambau__sepal_ui-644 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/btn.py:Btn.__init__",
"sepal_ui/sepalwidgets/btn.py:Btn._set_text"
],
"edited_modules": [
"sepal_ui/sepalwidgets/btn.py:Btn"
]
},
"file": ... | 12rambau/sepal_ui | 8a8196e3c7893b7a0aebdb4910e83054f59e0374 | sepal_ui.Btn does't work as expected
I want to create a simple Icon button, to do so:
```python
sw.Btn(icon=True, gliph ="mdi-plus")
```
Doing this without the "msg" parameter will add the default text to the button, which is "click"; I think having that value is worthless.
So if I want to remove the default text... | diff --git a/sepal_ui/sepalwidgets/btn.py b/sepal_ui/sepalwidgets/btn.py
index 137622fa..105f6160 100644
--- a/sepal_ui/sepalwidgets/btn.py
+++ b/sepal_ui/sepalwidgets/btn.py
@@ -25,6 +25,9 @@ class Btn(v.Btn, SepalWidget):
.. deprecated:: 2.13
``text`` and ``icon`` will be replaced by ``msg`` and ``gli... |
12rambau__sepal_ui-646 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/alert.py:Alert.update_progress"
],
"edited_modules": [
"sepal_ui/sepalwidgets/alert.py:Alert"
]
},
"file": "sepal_ui/sepalwidgets/alert.py"
}
] | 12rambau/sepal_ui | 8a8196e3c7893b7a0aebdb4910e83054f59e0374 | allow other values for progress
Now that we are supporting tqdm it should be possible to support progress values that are not between 0 and 1. https://github.com/12rambau/sepal_ui/blob/c15a83dc6c92d076e6932afab4e4b2987585894b/sepal_ui/sepalwidgets/alert.py#L98 | diff --git a/sepal_ui/sepalwidgets/alert.py b/sepal_ui/sepalwidgets/alert.py
index 68e3f115..de6d4abb 100644
--- a/sepal_ui/sepalwidgets/alert.py
+++ b/sepal_ui/sepalwidgets/alert.py
@@ -94,9 +94,10 @@ class Alert(v.Alert, SepalWidget):
self.show()
# cast the progress to float
+ total = tqdm_... |
12rambau__sepal_ui-747 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/sepalwidget.py:SepalWidget.get_children"
],
"edited_modules": [
"sepal_ui/sepalwidgets/sepalwidget.py:SepalWidget"
]
},
"file": "sepal_ui/sepalwid... | 12rambau/sepal_ui | a683a7665a9710acd5ca939308e18539e92014b7 | make get_children recursively again
previous implementation used recursion to find all children within the widget that matches with the query, now it returns only first level of matching children, could we make it reclusively again? | diff --git a/sepal_ui/sepalwidgets/sepalwidget.py b/sepal_ui/sepalwidgets/sepalwidget.py
index 40826809..00cbe015 100644
--- a/sepal_ui/sepalwidgets/sepalwidget.py
+++ b/sepal_ui/sepalwidgets/sepalwidget.py
@@ -177,11 +177,11 @@ class SepalWidget(v.VuetifyWidget):
is_klass = isinstance(w, klass)
... |
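The recursive lookup the issue asks to restore can be sketched as a standalone function (the real method lives on `SepalWidget` and walks ipyvuetify's `children` attribute; the widget classes below are minimal stand-ins for the demo):

```python
def get_children(widget, klass):
    """Recursively collect every descendant of `widget` that is an instance of `klass`."""
    matched = []
    for child in getattr(widget, "children", []):
        if isinstance(child, klass):
            matched.append(child)
        # recurse so matches nested several levels deep are found too
        matched.extend(get_children(child, klass))
    return matched

# minimal stand-ins for nested widgets (hypothetical, just for the demo)
class Box:
    def __init__(self, children=()):
        self.children = list(children)

class Chip(Box):
    pass

tree = Box([Box([Chip()]), Chip()])
found = get_children(tree, Chip)
```

The deeply nested `Chip` is found alongside the top-level one, which is exactly what the first-level-only implementation missed.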
12rambau__sepal_ui-758 | [
{
"changes": {
"added_entities": [
"sepal_ui/sepalwidgets/inputs.py:DatePicker.today"
],
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"sepal_ui/sepalwidgets/inputs.py:DatePicker"
]
},
"file": "sepal_ui/sepalwidgets/inputs.py"
}... | 12rambau/sepal_ui | 27a18eba37bec8ef1cabfa6bcc4022164ebc4c3b | add a today() method for the datepicker
It's something I do a lot, setting up the datepicker to today as:
```python
from sepal_ui import sepalwidgets as sw
from datetime import datetime
dp = sw.DatePicker()
# do stuff and as a fallback do
dp.v_model = datetime.today().strftime("%Y-%m-%d")
```
In... | diff --git a/sepal_ui/sepalwidgets/inputs.py b/sepal_ui/sepalwidgets/inputs.py
index 04e69553..0cb7a9bf 100644
--- a/sepal_ui/sepalwidgets/inputs.py
+++ b/sepal_ui/sepalwidgets/inputs.py
@@ -156,6 +156,12 @@ class DatePicker(v.Layout, SepalWidget):
return
+ def today(self) -> Self:
+ """Update th... |
12rambau__sepal_ui-774 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/inputs.py:FileInput.__init__",
"sepal_ui/sepalwidgets/inputs.py:FileInput._get_items"
],
"edited_modules": [
"sepal_ui/sepalwidgets/inputs.py:FileInput"... | 12rambau/sepal_ui | 2576446debe3544f3edeb208c76f671ffc0c8650 | Restrict maximum parent level from InputFile
I have some apps where I'm only interested in searching up to a certain level, i.e., module_downloads, and I think that in most of them the user doesn't need to go above sepal_user; once they start clicking, they could easily get lost across multiple folders.
what if w... | diff --git a/sepal_ui/sepalwidgets/inputs.py b/sepal_ui/sepalwidgets/inputs.py
index 04e69553..cd561bb5 100644
--- a/sepal_ui/sepalwidgets/inputs.py
+++ b/sepal_ui/sepalwidgets/inputs.py
@@ -205,6 +205,9 @@ class FileInput(v.Flex, SepalWidget):
clear: Optional[v.Btn] = None
"clear btn to remove everything and... |
12rambau__sepal_ui-814 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/sepalwidgets/alert.py:Alert.update_progress"
],
"edited_modules": [
"sepal_ui/sepalwidgets/alert.py:Alert"
]
},
"file": "sepal_ui/sepalwidgets/alert.py"
}
] | 12rambau/sepal_ui | 6d825ae167f96ad2e7b76b96ca07de562f74dcf0 | avoid to force developer to set total each time
I should be able to init the progress of an Alert first and then simply update the progress, as in:
```python
from sepal_ui import sepalwidgets as sw
alert = sw.Alert()
# init
alert.update_progress(0, "toto", total=10)
# loop
for i in range(10):
... | diff --git a/sepal_ui/sepalwidgets/alert.py b/sepal_ui/sepalwidgets/alert.py
index 19718f51..8dafab92 100644
--- a/sepal_ui/sepalwidgets/alert.py
+++ b/sepal_ui/sepalwidgets/alert.py
@@ -108,14 +108,17 @@ class Alert(v.Alert, SepalWidget):
Args:
progress: the progress status in float
... |
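The behavior the issue asks for, passing `total` once and then omitting it, can be sketched like this (names are illustrative; the real Alert drives a tqdm progress bar):

```python
class Alert:
    """Sketch of a progress helper that remembers `total` between calls."""
    def __init__(self):
        self.total = 1.0
        self.progress = 0.0

    def update_progress(self, progress, msg="", total=None):
        # only replace the cached total when the caller passes one explicitly
        if total is not None:
            self.total = float(total)
        self.progress = progress / self.total  # normalized to a 0-1 fraction

alert = Alert()
alert.update_progress(0, "toto", total=10)
for i in range(10):
    alert.update_progress(i + 1)  # no need to repeat total here
```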
12rambau__sepal_ui-896 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"sepal_ui/planetapi/planet_model.py:PlanetModel.init_session",
"sepal_ui/planetapi/planet_model.py:PlanetModel.get_mosaics",
"sepal_ui/planetapi/planet_model.py:PlanetModel.get_quad"
... | 12rambau/sepal_ui | b91b2a2c45b4fa80a7a0c699df978ebc46682260 | get_mosaics from planet api fails
`get_mosaics`, and probably `get_quads`, fail when the authentication process is done `from_login`... the test passed because we are only testing the initialization of `Planet` `from_key`, not getting elements from it | diff --git a/sepal_ui/message/en/locale.json b/sepal_ui/message/en/locale.json
index abdea912..b65a232d 100644
--- a/sepal_ui/message/en/locale.json
+++ b/sepal_ui/message/en/locale.json
@@ -85,14 +85,17 @@
"exception": {
"empty": "Please fill the required field(s).",
"invalid": "Invalid email or pas... |
15five__scim2-filter-parser-13 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": null
},
"file": "setup.py"
},
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"src/scim2_filter_parser/tra... | 15five/scim2-filter-parser | 3ed1858b492542d0bc9b9e9ab9547641595e28c1 | Return NamedTuple rather than tuple.
It would be nice to return a NamedTuple instead of a tuple here:
https://github.com/15five/scim2-filter-parser/blob/7ddc216f8c3dd1cdb2152944187e8f7f5ee07be2/src/scim2_filter_parser/transpilers/sql.py#L148
This way parts of each path could be accessed by name rather than by in... | diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index 12a5d4f..178f172 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -1,6 +1,10 @@
CHANGE LOG
==========
+0.3.5
+-----
+- Update the sql.Transpiler to collect namedtuples rather than tuples for attr paths
+
0.3.4
-----
- Update tox.ini and clean up linting errors
di... |
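The change boils down to swapping the plain tuple for a `typing.NamedTuple`; the field names below are illustrative, not necessarily the ones the transpiler chose:

```python
from typing import NamedTuple, Optional

class AttrPath(NamedTuple):
    """Named parts of a SCIM attribute path, e.g. 'name.familyName'."""
    attr_name: str
    sub_attr: Optional[str]
    uri: Optional[str]

path = AttrPath("name", "familyName", None)
# parts are now accessible by name, while index access and unpacking still work
assert path.sub_attr == path[1]
```

Because `NamedTuple` subclasses `tuple`, existing code that indexes or unpacks the result keeps working unchanged.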
15five__scim2-filter-parser-20 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"src/scim2_filter_parser/parser.py:SCIMParser"
]
},
"file": "src/scim2_filter_parser/parser.py"
}
] | 15five/scim2-filter-parser | 08de23c5626556a37beced764a22a2fa7021989b | Issue when using multiple "or" or "and"
Hi,
I am facing an issue where a query having two or more "and" or more than two "or" fails.
Have a look at the examples below:
1)```"displayName co \"username\" or nickName co \"username\" or userName co \"username\""```
```"displayName co \"username\" and nick... | diff --git a/src/scim2_filter_parser/parser.py b/src/scim2_filter_parser/parser.py
index 516f65d..12c693e 100644
--- a/src/scim2_filter_parser/parser.py
+++ b/src/scim2_filter_parser/parser.py
@@ -110,9 +110,8 @@ class SCIMParser(Parser):
# which takes precedence over "or"
# 3. Attribute operators
... |
15five__scim2-filter-parser-31 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": null
},
"file": "setup.py"
},
{
"changes": {
"added_entities": [
"src/scim2_filter_parser/ast.py:AttrPath.case_insensitive",
"src/scim2_filter_pa... | 15five/scim2-filter-parser | c794bf3e50e3cb71bdcf919feb43d11912907dd2 | userName attribute should be case-insensitive, per the RFC
From https://github.com/15five/django-scim2/issues/76
> See https://datatracker.ietf.org/doc/html/rfc7643#section-4.1.1: (userName)
> This attribute is REQUIRED and is case insensitive.
> Currently this case-insensitive behavior is not implemented and... | diff --git a/CHANGELOG.rst b/CHANGELOG.rst
index 14f28e6..35eb5c5 100644
--- a/CHANGELOG.rst
+++ b/CHANGELOG.rst
@@ -1,5 +1,12 @@
CHANGE LOG
==========
+0.4.0
+-----
+- Update userName to be case insensitive. #31
+
+BREAKING CHANGE: This allows queries that did not match rows before to
+match rows now!
+
0.3.9
... |
20c__ctl-3 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"src/ctl/plugins/pypi.py:PyPIPlugin.dist_path",
"src/ctl/plugins/pypi.py:PyPIPlugin.prepare"
],
"edited_modules": [
"src/ctl/plugins/pypi.py:PyPIPluginConfig",
"src/ct... | 20c/ctl | 879af37647e61767a1ede59ffd353e4cfd27cd6f | PyPI plugin: `target` config attribute should be `repository`
This is so it's in line with the version plugin, which currently uses `repository` to specify the target repository
The pypi plugin currently uses `repository` to specify which PyPI repository to use; this should change to `pypi_repository` as well.
Sh... | diff --git a/src/ctl/plugins/pypi.py b/src/ctl/plugins/pypi.py
index 5d979af..a6117af 100644
--- a/src/ctl/plugins/pypi.py
+++ b/src/ctl/plugins/pypi.py
@@ -32,7 +32,7 @@ class PyPIPluginConfig(release.ReleasePluginConfig):
config_file = confu.schema.Str(help="path to pypi config file (e.g. ~/.pypirc)")
# P... |
2gis__k8s-handle-120 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"k8s_handle/templating.py:Renderer._evaluate_tags"
],
"edited_modules": [
"k8s_handle/templating.py:Renderer"
]
},
"file": "k8s_handle/templating.py"
}
] | 2gis/k8s-handle | 0ce48ecc5cd78eac5894241468a53080c3ccec64 | skip-tags does not work
Hello,
it seems `--skip-tags` does not work. Steps to reproduce:
```
git clone git@github.com:2gis/k8s-handle-example.git
```
Edit config.yaml, add some tag
```
staging:
templates:
- template: configmap.yaml.j2
- template: deployment.yaml.j2
- template: service.yaml.j2
ta... | diff --git a/k8s_handle/templating.py b/k8s_handle/templating.py
index 7f2d6b7..445a09e 100644
--- a/k8s_handle/templating.py
+++ b/k8s_handle/templating.py
@@ -195,7 +195,10 @@ class Renderer:
@staticmethod
def _evaluate_tags(tags, only_tags, skip_tags):
- if only_tags is None and skip_tags is None:... |
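The intended `--only-tags`/`--skip-tags` semantics can be sketched as a pure function (a guess at the desired behavior, not the exact k8s-handle code):

```python
def evaluate_tags(tags, only_tags=None, skip_tags=None):
    """Return True when a template carrying `tags` should be rendered."""
    tags = set(tags)
    if skip_tags and tags & set(skip_tags):
        return False  # at least one of its tags was explicitly skipped
    if only_tags and not tags & set(only_tags):
        return False  # an allow-list is active and none of its tags match
    return True

render = evaluate_tags({"tag1"}, skip_tags=["tag1"])
```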
2gis__k8s-handle-73 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"k8s_handle/config.py:_process_variable"
],
"edited_modules": [
"k8s_handle/config.py:_process_variable"
]
},
"file": "k8s_handle/config.py"
},
{
"changes": {
... | 2gis/k8s-handle | dec5c73ec1bcd694bd45651901d68cd933721b3e | It is not possible to concatenate several environment variables into one value
Hi.
I ran into the impossibility of using several environment variables at once when declaring a variable in config.yaml. A small example:
When deploying a service, we sometimes create more than one deployment; the registry is use... | diff --git a/.dockerignore b/.dockerignore
index d3f33a2..fea1e67 100644
--- a/.dockerignore
+++ b/.dockerignore
@@ -6,5 +6,10 @@ Dockerfile*
.gitignore
.idea/
.tox/
+.travis.yml
+tox.ini
__pychache__
htmlcov/
+tests/
+*.png
+
diff --git a/k8s_handle/config.py b/k8s_handle/config.py
index acbf34f..6c2564f 100644
-... |
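Concatenating several environment variables into one config value amounts to substituting every placeholder in the string, not just a whole-value match. A sketch (the placeholder syntax here is illustrative, not k8s-handle's exact template syntax):

```python
import os
import re

def process_variable(value):
    """Expand every {{ env='NAME' }}-style placeholder found in a config value."""
    pattern = re.compile(r"\{\{\s*env\s*=\s*'(\w+)'\s*\}\}")
    # re.sub replaces all occurrences, so several variables can be combined
    return pattern.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["REGISTRY"] = "registry.local"
os.environ["TAG"] = "v1"
image = process_variable("{{ env='REGISTRY' }}/app:{{ env='TAG' }}")
```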
2gis__k8s-handle-84 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"k8s_handle/config.py:load_context_section"
],
"edited_modules": [
"k8s_handle/config.py:load_context_section"
]
},
"file": "k8s_handle/config.py"
}
] | 2gis/k8s-handle | 92f764f44301bcd406d588a4db5cf0333fc1ccc2 | Empty section passes validation
An empty section string (-s "") passes validation and causes an unwrapped KeyError. | diff --git a/k8s_handle/config.py b/k8s_handle/config.py
index df08de4..17cdb33 100644
--- a/k8s_handle/config.py
+++ b/k8s_handle/config.py
@@ -156,6 +156,9 @@ def _update_context_recursively(context, include_history=[]):
def load_context_section(section):
+ if not section:
+ raise RuntimeError('Empty s... |
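The fix pattern is a guard before the dict lookup, so callers get a clear error instead of a bare KeyError. A sketch (the real error message is truncated above, so the wording here is assumed):

```python
def load_context_section(config, section):
    """Fail fast on an empty or unknown section name."""
    if not section:
        raise RuntimeError("Empty section specified, please provide one with -s")
    if section not in config:
        raise RuntimeError(f"Section '{section}' not found in config")
    return config[section]

try:
    load_context_section({"staging": {}}, "")
except RuntimeError as error:
    message = str(error)
```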
3YOURMIND__django-migration-linter-113 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"django_migration_linter/migration_linter.py:MigrationLinter.read_migrations_list",
"django_migration_linter/migration_linter.py:MigrationLinter._gather_migrations_git",
"django_migration... | 3YOURMIND/django-migration-linter | 799957a5564e8ca1ea20d7cf643abbc21db4e40f | Bug: --include-migrations-from argument being ignored
In version 2.2.2, using the `--include-migrations-from` argument and specifying a migration .py file will not work, and `lintmigrations` will run on all migration files.
On [line 299](https://github.com/3YOURMIND/django-migration-linter/blob/799957a5564e8ca1ea20d7c... | diff --git a/django_migration_linter/migration_linter.py b/django_migration_linter/migration_linter.py
index 31f8fea..c5ea333 100644
--- a/django_migration_linter/migration_linter.py
+++ b/django_migration_linter/migration_linter.py
@@ -289,8 +289,13 @@ class MigrationLinter(object):
@classmethod
def read_m... |
3YOURMIND__django-migration-linter-156 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"django_migration_linter/migration_linter.py:MigrationLinter.lint_migration",
"django_migration_linter/migration_linter.py:MigrationLinter.lint_runsql"
],
"edited_modules": [
... | 3YOURMIND/django-migration-linter | a119b4ba1fdfd27bf950e109771c6fd3e41d48dc | Raise backend specific deployment implications
For instance, certain operations will potentially lock a table, which can have implications during deployment (lots of operations require the table => we don't acquire the lock waiting to migrate / once we have the lock, long migration, locking the table and making producti...
index 75223f0..1ee0f0b 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -4,6 +4,9 @@
* the positional argument `GIT_COMMIT_ID` becomes an optional argument with the named parameter ` --git-commit-id [GIT_COMMIT_ID]`
* the `lintmigrations` command takes now two positional argume... |
3YOURMIND__django-migration-linter-186 | [
{
"changes": {
"added_entities": [
"django_migration_linter/sql_analyser/postgresql.py:has_create_index"
],
"added_modules": [
"django_migration_linter/sql_analyser/postgresql.py:has_create_index"
],
"edited_entities": null,
"edited_modules": [
"django... | 3YOURMIND/django-migration-linter | aef3db3e4198d06c38bc4b0874e72ed657891eea | Linter fails on CREATE INDEX when creating a new table
Here is an example `CreateModel` from Django:
```python
migrations.CreateModel(
name='ShipmentMetadataAlert',
fields=[
('deleted_at', models.DateTimeField(blank=True, db_index=True, null=True)),
('created_at', common.fields.CreatedFi... | diff --git a/CHANGELOG.md b/CHANGELOG.md
index d1ec8e5..15fefc0 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,7 +1,8 @@
-## 4.0.0
+## 4.0.0 (unreleased)
- Drop support for Python 2.7 and 3.5
- Drop support for Django 1.11, 2.0, 2.1, 3.0
+- Fix index creation detection when table is being created in the transac... |
3YOURMIND__django-migration-linter-222 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"django_migration_linter/migration_linter.py:MigrationLinter.get_runpython_model_import_issues"
],
"edited_modules": [
"django_migration_linter/migration_linter.py:MigrationLinter"
... | 3YOURMIND/django-migration-linter | 3baf9487bde6ae27c3ba7623a410ab6c39bb0584 | Linter failing when using django 'through'
### through doc
https://docs.djangoproject.com/en/4.0/ref/models/fields/#django.db.models.ManyToManyField.through
### Example code
```
def forwards_func(apps, schema_editor):
Question = apps.get_model("solution", "Question")
...
Question.many_to_may.throug... | diff --git a/CHANGELOG.md b/CHANGELOG.md
index a1b5213..300fe00 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,3 +1,7 @@
+## 4.1.1 (unreleased)
+
+- Fixed `RunPython` model import check when using a `through` object like `MyModel.many_to_many.through.objects.filter(...)` (issue #218)
+
## 4.1.0
- Allow configur... |
3YOURMIND__django-migration-linter-258 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"django_migration_linter/sql_analyser/base.py:has_not_null_column"
],
"edited_modules": [
"django_migration_linter/sql_analyser/base.py:has_not_null_column"
]
},
"file":... | 3YOURMIND/django-migration-linter | 366d16b01a72d0baa54fef55761d846b0f05b8dd | Adding an index with a NOT NULL condition incorrectly triggers NOT_NULL rule
Adding an index with a `WHERE` clause including `NOT NULL` gets flagged as a `NOT NULL constraint on columns` error.
## Steps to reproduce
The following migration operation:
```python
AddIndexConcurrently(
model_name="prediction",
... | diff --git a/CHANGELOG.md b/CHANGELOG.md
index 3069d91..beafd65 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -4,10 +4,21 @@
Instead, the linter crashes and lets the `sqlmigrate` error raise, in order to avoid letting a problematic migration pass.
One common reason for such an error is the SQL generation which requ... |
3YOURMIND__django-migration-linter-47 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"django_migration_linter/migration_linter.py:_main"
],
"edited_modules": [
"django_migration_linter/migration_linter.py:_main"
]
},
"file": "django_migration_linter/migr... | 3YOURMIND/django-migration-linter | fbf0f4419336fcb1235fa57f5575ad2593354e44 | Add --version option
Pretty straightforward. Have a `--version` that prints the current version of the linter. | diff --git a/django_migration_linter/migration_linter.py b/django_migration_linter/migration_linter.py
index f9c0ab1..03c2054 100644
--- a/django_migration_linter/migration_linter.py
+++ b/django_migration_linter/migration_linter.py
@@ -20,7 +20,7 @@ from subprocess import Popen, PIPE
import sys
from .cache import ... |
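argparse's built-in `version` action is the standard way to implement such a flag; a minimal sketch (program name and version string are hypothetical):

```python
import argparse

parser = argparse.ArgumentParser(prog="lintmigrations")
parser.add_argument(
    "--version",
    action="version",          # prints the version string and exits
    version="%(prog)s 0.1.0",  # hypothetical version number
)

# argparse exits after printing, so probe the behavior via SystemExit
try:
    parser.parse_args(["--version"])
except SystemExit as exit_info:
    exit_code = exit_info.code
```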
42DIGITAL__bqtools-11 | [
{
"changes": {
"added_entities": [
"fourtytwo/bqtools/__init__.py:BQTable.__repr__"
],
"added_modules": null,
"edited_entities": [
"fourtytwo/bqtools/__init__.py:BQTable.__eq__",
"fourtytwo/bqtools/__init__.py:BQTable._set_schema",
"fourtytwo/bqtools/__ini... | 42DIGITAL/bqtools | 98ce0de1d976f33cf04217ef50f864f74bd5ed52 | append data to empty table.data fails - required for schema_only
```python
>>> from fourtytwo import bqtools
>>> table = bqtools.read_bq(table_ref='project.dataset.table', schema_only=True)
>>> table.append([])
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/bqtools/fourtytwo/bqt... | diff --git a/fourtytwo/bqtools/__init__.py b/fourtytwo/bqtools/__init__.py
index 9542242..136e18c 100644
--- a/fourtytwo/bqtools/__init__.py
+++ b/fourtytwo/bqtools/__init__.py
@@ -104,14 +104,21 @@ class BQTable(object):
def __init__(self, schema=None, data=None):
if DEBUG:
logging.debug('bq... |
4Catalyzer__flask-resty-248 | [
{
"changes": {
"added_entities": [
"flask_resty/compat.py:_strict_run",
"flask_resty/compat.py:schema_load",
"flask_resty/compat.py:schema_dump"
],
"added_modules": [
"flask_resty/compat.py:_strict_run",
"flask_resty/compat.py:schema_load",
"flas... | 4Catalyzer/flask-resty | ac43163453fab1b23434d29f71a3c1b34b251c0a | Support Marshmallow 3
[From here](https://marshmallow.readthedocs.io/en/latest/upgrading.html#schemas-are-always-strict):
> Schema().load and Schema().dump don’t return a (data, errors) tuple any more. Only data is returned. | diff --git a/.travis.yml b/.travis.yml
index 708e4a9..00ffb59 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -4,16 +4,21 @@ services:
- postgresql
matrix:
include:
- - { python: "2.7", env: "TOXENV=py-full DATABASE_URL=postgres://localhost/travis_ci_test" }
- - { python: "3.5", env: "TOXENV=py-full DATA... |
4degrees__clique-26 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"source/clique/collection.py:Collection.format"
],
"edited_modules": [
"source/clique/collection.py:Collection"
]
},
"file": "source/clique/collection.py"
}
] | 4degrees/clique | a89507304acce5931f940c34025a6547fa8227b5 | collection.format hits maximum recursion depth for collections with lots of holes.
The following code gives an example.
```python
paths = ["name.{0:04d}.jpg".format(x) for x in range(2000)[::2]]
collection = clique.assemble(paths)[0][0]
collection.format("{head}####{tail}")
``` | diff --git a/source/clique/collection.py b/source/clique/collection.py
index 0c3b296..db9276c 100644
--- a/source/clique/collection.py
+++ b/source/clique/collection.py
@@ -251,15 +251,25 @@ class Collection(object):
else:
data['padding'] = '%d'
- if self.indexes:
+ if '{holes}' in... |
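The recursion-depth failure above comes from `format` recursing once per hole. As an illustrative sketch only (not clique's actual implementation), the same range compression can be done iteratively, so the stack depth stays constant no matter how many holes the collection has:

```python
def compress_ranges(indexes):
    """Collapse sorted indexes into contiguous ranges, iteratively.

    Sketch of the idea behind the fix: a loop instead of recursion,
    so 1000 isolated indexes cannot exhaust the call stack.
    """
    ranges = []
    for index in sorted(indexes):
        if ranges and index == ranges[-1][1] + 1:
            ranges[-1][1] = index  # extend the current contiguous run
        else:
            ranges.append([index, index])  # start a new run
    return ['{0}'.format(s) if s == e else '{0}-{1}'.format(s, e)
            for s, e in ranges]


print(compress_ranges([1, 2, 3, 7, 8]))         # ['1-3', '7-8']
print(len(compress_ranges(range(0, 2000, 2))))  # 1000 holes, no RecursionError
```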
6si__shipwright-79 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"shipwright/base.py:Shipwright.__init__",
"shipwright/base.py:Shipwright._build"
],
"edited_modules": [
"shipwright/base.py:Shipwright"
]
},
"file": "shipwright/... | 6si/shipwright | 7d3ccf39acc79bb6d33a787e773227358764dd2c | docker pull all images for current branch and master before building
Because our buildserver forgets the docker cache between builds we pull the previous build for all the images.
It would be great if we could get shipwright to do it.
Otherwise a command like "shipwright images" which lists all the images that shi... | diff --git a/CHANGES.rst b/CHANGES.rst
index f034d37..89cf5f1 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,7 +1,8 @@
0.5.1 (unreleased)
------------------
-- Nothing changed yet.
+- Add --pull-cache to pull images from repository before building.
+ (`Issue #49 <https://github.com/6si/shipwright/issues/49>`_).
... |
AI-SDC__AI-SDC-94 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"aisdc/attacks/report.py:NumpyArrayEncoder.default"
],
"edited_modules": [
"aisdc/attacks/report.py:NumpyArrayEncoder"
]
},
"file": "aisdc/attacks/report.py"
},
{
... | AI-SDC/AI-SDC | a42a2110ade262a7d699d5b71cfccbc787290d5d | Add option to include target model error into attacks as a feature
Whether or not the target model classifies an example correctly provides some signal that could be of use to an attacker. We currently do not use this in the attacks, but should include an option to allow it.
The 0/1 loss between the predicted class ... | diff --git a/aisdc/attacks/report.py b/aisdc/attacks/report.py
index 12b3887..515c709 100644
--- a/aisdc/attacks/report.py
+++ b/aisdc/attacks/report.py
@@ -1,4 +1,5 @@
"""Code for automatic report generation"""
+import abc
import json
import numpy as np
@@ -83,6 +84,8 @@ class NumpyArrayEncoder(json.JSONEncoder):... |
AI4S2S__lilio-58 | [
{
"changes": {
"added_entities": [
"lilio/calendar.py:Calendar.n_precursors"
],
"added_modules": null,
"edited_entities": [
"lilio/calendar.py:Calendar.n_targets"
],
"edited_modules": [
"lilio/calendar.py:Calendar"
]
},
"file": "lilio/cal... | AI4S2S/lilio | 416ea560d41b57502cf204fbf4e65e79c21373bf | Unclear error message when an empty calendar is passed to `resample` function.
When executing the following code:
```py
import lilio
from lilio import Calendar
import xarray as xr
ds = xr.load_dataset(path_to_data) # just some sample data
cal = Calendar("12-25")
# note that no intervals are added to the calen... | diff --git a/lilio/calendar.py b/lilio/calendar.py
index c288ebb..93ee1db 100644
--- a/lilio/calendar.py
+++ b/lilio/calendar.py
@@ -206,10 +206,15 @@ class Calendar:
self._set_mapping(mapping)
@property
- def n_targets(self):
+ def n_targets(self) -> int:
"""Return the number of targets.... |
AI4S2S__s2spy-106 | [
{
"changes": {
"added_entities": [
"s2spy/rgdr/rgdr.py:RGDR.preview_correlation",
"s2spy/rgdr/rgdr.py:RGDR.preview_clusters"
],
"added_modules": null,
"edited_entities": [
"s2spy/rgdr/rgdr.py:RGDR.plot_correlation",
"s2spy/rgdr/rgdr.py:RGDR.plot_clusters"
... | AI4S2S/s2spy | 81682c3a15708eb9fccb705796b134933001afb4 | Merge the plotting functionalities in RGDR module
Currently in RGDR module, there are two ways to plot the clusters:
- Preview the clusters by `RGDR(...).plot_clusters(precursor_field, target_timeseries)`
- Plotting clusters after fitting and transforming `cluster_map.cluster_labels[0].plot()`
The first option was... | diff --git a/notebooks/tutorial_RGDR.ipynb b/notebooks/tutorial_RGDR.ipynb
index 84f5228..0f8213a 100644
--- a/notebooks/tutorial_RGDR.ipynb
+++ b/notebooks/tutorial_RGDR.ipynb
@@ -76,14 +76,27 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Using `.plot_correlation` we can see the correlation... |
AI4S2S__s2spy-77 | [
{
"changes": {
"added_entities": [
"s2spy/_base_calendar.py:BaseCalendar.set_max_lag"
],
"added_modules": null,
"edited_entities": [
"s2spy/_base_calendar.py:BaseCalendar.__init__",
"s2spy/_base_calendar.py:BaseCalendar._get_nintervals",
"s2spy/_base_calen... | AI4S2S/s2spy | c254e11f1a64aaa52593e3848b71207421a16c61 | Dealing with mismatch between calendar map and data time extent
PR #60 changes resample so that it takes a mapped calendar and resamples the input data based on those intervals.
However, this will require some extra checks and data handling to prevent user error;
- [ ] Add check to see if the data can fully cover... | diff --git a/notebooks/tutorial_time.ipynb b/notebooks/tutorial_time.ipynb
index 149fa99..4cdcf45 100644
--- a/notebooks/tutorial_time.ipynb
+++ b/notebooks/tutorial_time.ipynb
@@ -27,7 +27,7 @@
"name": "stdout",
"output_type": "stream",
"text": [
- "AdventCalendar(month=12, day=31, freq=7d, n_ta... |
AI4S2S__s2spy-83 | [
{
"changes": {
"added_entities": [
"s2spy/rgdr/rgdr.py:RGDR.get_correlation",
"s2spy/rgdr/rgdr.py:RGDR.get_clusters",
"s2spy/rgdr/rgdr.py:RGDR.fit_transform"
],
"added_modules": null,
"edited_entities": [
"s2spy/rgdr/rgdr.py:masked_spherical_dbscan",
... | AI4S2S/s2spy | 1bd615deba811a0b978e2c3abfad6c1de1be8851 | Return correlation / p_values from RGDR
Currently the correlation and p_values are not accessible by the user. The user can only visualize the those values via RGDR built-in plot functions. The users may need to investigate/store these values in case they want to check the correlation of the precursor fields and target... | diff --git a/notebooks/tutorial_RGDR.ipynb b/notebooks/tutorial_RGDR.ipynb
index 198ccea..29a7389 100644
--- a/notebooks/tutorial_RGDR.ipynb
+++ b/notebooks/tutorial_RGDR.ipynb
@@ -53,7 +53,7 @@
"target_timeseries = target_resampled.sel(cluster=3).ts.isel(i_interval=0)\n",
"precursor_field = field_resampled.s... |
ARM-software__mango-42 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"mango/domain/domain_space.py:domain_space.create_mappings"
],
"edited_modules": [
"mango/domain/domain_space.py:domain_space"
]
},
"file": "mango/domain/domain_space.py... | ARM-software/mango | f1b2aaef4b2d6ba5b5ed1667346c1d9cfb708d85 | Variable type issue in domain_space.py file
System: Ubuntu 18.04, Python; 3.8.8 (via conda environment), Tensorflow: 2.4.0 (with GPU), numpy: 1.18.5
Error happens when using TF in conjunction with Mango for neural architecture search.
In https://github.com/ARM-software/mango/blob/master/mango/domain/domain_space.py... | diff --git a/mango/domain/domain_space.py b/mango/domain/domain_space.py
index 86cf9da..b02f833 100644
--- a/mango/domain/domain_space.py
+++ b/mango/domain/domain_space.py
@@ -70,7 +70,7 @@ class domain_space():
pass # we are not doing anything at present, and will directly use its value for GP.
... |
ARMmbed__greentea-237 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"mbed_greentea/mbed_report_api.py:exporter_json",
"mbed_greentea/mbed_report_api.py:exporter_testcase_junit",
"mbed_greentea/mbed_report_api.py:get_result_overlay_dropdowns",
"mbe... | ARMmbed/greentea | 86f5ec3211a8f7f324bcdd3201012945ee0534ac | mbedgt crash with float division by zero
Hi
Here is my command:
mbedgt -V -v -t NUCLEO_F401RE-ARM,NUCLEO_F401RE-GCC_ARM,NUCLEO_F401RE-IAR,NUCLEO_F410RB-ARM,NUCLEO_F410RB-GCC_ARM,NUCLEO_F410RB-IAR,NUCLEO_F411RE-ARM,NUCLEO_F411RE-GCC_ARM,NUCLEO_F411RE-IAR --report-html=/c/xxx.html
It has crashed:
...
mbedgt: a... | diff --git a/mbed_greentea/mbed_report_api.py b/mbed_greentea/mbed_report_api.py
index da3f0d9..82acb5c 100644
--- a/mbed_greentea/mbed_report_api.py
+++ b/mbed_greentea/mbed_report_api.py
@@ -38,6 +38,13 @@ def exporter_json(test_result_ext, test_suite_properties=None):
@details This is a machine friendly format
... |
ARMmbed__greentea-243 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"mbed_greentea/mbed_report_api.py:exporter_json",
"mbed_greentea/mbed_report_api.py:get_result_overlay_dropdowns"
],
"edited_modules": [
"mbed_greentea/mbed_report_api.py:expo... | ARMmbed/greentea | 8f7b28f8ec739156d238304fa4f5f2e5156536f5 | mbedgt crash with UnicodeDecodeError
Hi
I am sorry, but I still get some crash with the new greentea version ...
mbedgt: exporting to HTML file 'C:/mcu/reports/report__mbed_os5_release_non_regression_F756ZG_mbed-os-5.5.7__2017_09_28_00_06.html'...
mbedgt: unexpected error:
'unicodeescape' codec can't decode b... | diff --git a/mbed_greentea/mbed_report_api.py b/mbed_greentea/mbed_report_api.py
index 166bc29..22a3778 100644
--- a/mbed_greentea/mbed_report_api.py
+++ b/mbed_greentea/mbed_report_api.py
@@ -42,7 +42,7 @@ def exporter_json(test_result_ext, test_suite_properties=None):
for suite in target.values():
... |
ARMmbed__greentea-250 | [
{
"changes": {
"added_entities": [
"mbed_greentea/mbed_target_info.py:suppress",
"mbed_greentea/mbed_target_info.py:_get_platform_property_from_default",
"mbed_greentea/mbed_target_info.py:_get_platform_property_from_info_mapping",
"mbed_greentea/mbed_target_info.py:_platfo... | ARMmbed/greentea | b8bcffbb7aaced094f252a4ddfe930e8237fb484 | Target property priority incorrect
Currently we have priority as follows:
```
internal yotta blob > targets.json > tool default
```
This is a bug.
Instead the priority should be:
```
targets.json w/ default > internal yotta blob > tool default
```
This implies a few test cases:
In targets.json | I... | diff --git a/mbed_greentea/mbed_target_info.py b/mbed_greentea/mbed_target_info.py
index 356676b..c825bcf 100644
--- a/mbed_greentea/mbed_target_info.py
+++ b/mbed_greentea/mbed_target_info.py
@@ -20,6 +20,17 @@ Author: Przemyslaw Wirkus <Przemyslaw.Wirkus@arm.com>
import os
import re
import json
+from os import wal... |
ARMmbed__greentea-263 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"mbed_greentea/mbed_greentea_cli.py:create_filtered_test_list"
],
"edited_modules": [
"mbed_greentea/mbed_greentea_cli.py:create_filtered_test_list"
]
},
"file": "mbed_g... | ARMmbed/greentea | 68508c5f4d7cf0635c75399d0ff7cfa896fdf2cc | Test names are not correctly globbed
Test names only respect a wildcard that is placed at the end of the string. Ex. "mbed-os-*".
However, it does not respect the wildcard anywhere else. Ex. "*-timer"
The build tools accept these wildcards, so greentea should as well. This is the line responsible: https://github.... | diff --git a/mbed_greentea/mbed_greentea_cli.py b/mbed_greentea/mbed_greentea_cli.py
index f6a13c4..446b965 100644
--- a/mbed_greentea/mbed_greentea_cli.py
+++ b/mbed_greentea/mbed_greentea_cli.py
@@ -23,6 +23,7 @@ import os
import sys
import random
import optparse
+import fnmatch
from time import time
try:
f... |
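The underlying issue is that a hand-rolled prefix match only honours a trailing `*`. Python's standard `fnmatch` module (which the fix imports) matches wildcards at any position; the test names below are made up for illustration:

```python
import fnmatch

# Hypothetical test names -- only the matching behaviour matters here.
test_names = ['mbed-os-tests-timer', 'mbed-os-tests-rtc', 'lp-timer']

print(fnmatch.filter(test_names, 'mbed-os-*'))  # trailing wildcard
print(fnmatch.filter(test_names, '*-timer'))    # leading wildcard works too
```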
ARMmbed__yotta-802 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"yotta/install.py:installComponentAsDependency"
],
"edited_modules": [
"yotta/install.py:installComponentAsDependency"
]
},
"file": "yotta/install.py"
},
{
"chan... | ARMmbed/yotta | ae1cda2082f6f82c1c9f80f6194fcae62d228bc1 | Install yotta modules from github with git credentials
I'd like to do following but it fails:
```
$ yotta install git@github.com:ARMmbed/module-x.git
info: get versions for git
Fatal Exception, yotta=0.17.2
Traceback (most recent call last):
File "/home/jaakor01/workspace/yotta_issue/venv/bin/yotta", line 4, in... | diff --git a/docs/reference/buildsystem.md b/docs/reference/buildsystem.md
index e728c9f..6cc8bd9 100644
--- a/docs/reference/buildsystem.md
+++ b/docs/reference/buildsystem.md
@@ -30,7 +30,7 @@ The name of the library being built by the current module is available as
No header needs to be included for this definition... |
ARMmbed__yotta-804 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"yotta/lib/sourceparse.py:_getNonRegistryRef"
],
"edited_modules": [
"yotta/lib/sourceparse.py:_getNonRegistryRef"
]
},
"file": "yotta/lib/sourceparse.py"
}
] | ARMmbed/yotta | 4094b7a26c66dd64ff724d4f72da282d41ea9fca | Semver incompatibility
Hi,
It seems the recent version has broken semantic versioning for any previous version, which we heavily use in our project, [microbit-dal](https://github.com/lancaster-university/microbit-dal).
We have had two new users on v18 who have reported this breakage: https://github.com/lancaster-uni... | diff --git a/yotta/lib/sourceparse.py b/yotta/lib/sourceparse.py
index 0f451ad..eb1f0b4 100644
--- a/yotta/lib/sourceparse.py
+++ b/yotta/lib/sourceparse.py
@@ -57,7 +57,7 @@ def _getNonRegistryRef(source_url):
# something/something#spec = github
# something/something@spec = github
# something/something ... |
ASFHyP3__hyp3-autorift-202 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"hyp3_autorift/io.py:write_geospatial"
],
"edited_modules": [
"hyp3_autorift/io.py:write_geospatial"
]
},
"file": "hyp3_autorift/io.py"
},
{
"changes": {
"... | ASFHyP3/hyp3-autorift | ca9bda5f0a8a5db47d0d83c825a9d447a6c4180e | Landsat 7 + 8 pairs may not be handled well
Since we only look at the reference scene to [determine the platform](https://github.com/ASFHyP3/hyp3-autorift/blob/develop/hyp3_autorift/process.py#L319), we may have some issues with the secondary scene:
- [x] L7 reference scene w/ a L8 secondary will attempt to apply the ... | diff --git a/CHANGELOG.md b/CHANGELOG.md
index e092e35..a9108fb 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -7,6 +7,11 @@ and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+## [0.10.4]
+
+### Fixed
+* Landsat 7+... |
ASFHyP3__hyp3-autorift-49 | [
{
"changes": {
"added_entities": [
"hyp3_autorift/geometry.py:poly_bounds_in_proj"
],
"added_modules": [
"hyp3_autorift/geometry.py:poly_bounds_in_proj"
],
"edited_entities": [
"hyp3_autorift/geometry.py:polygon_from_bbox",
"hyp3_autorift/geometry.py... | ASFHyP3/hyp3-autorift | cf8c6dadd1d4af9cb310a2eb470ac0c2e820c2ab | Get associated autoRIFT files from parameter shapefile
Currently, we hardcode the set of needed files for processing:
https://github.com/ASFHyP3/hyp3-autorift/blob/develop/hyp3_autorift/io.py#L47-L68
These however, at detailed in the new parameter shapefile and could be pulled from there instead to be more flexi... | diff --git a/hyp3_autorift/geometry.py b/hyp3_autorift/geometry.py
index 5b66d76..a2c6a13 100644
--- a/hyp3_autorift/geometry.py
+++ b/hyp3_autorift/geometry.py
@@ -2,19 +2,18 @@
import logging
import os
+from typing import Tuple
import isce # noqa: F401
import isceobj
import numpy as np
from contrib.demUtil... |
ASFHyP3__hyp3-sdk-152 | [
{
"changes": {
"added_entities": [
"hyp3_sdk/jobs.py:Batch.__eq__"
],
"added_modules": null,
"edited_entities": [
"hyp3_sdk/jobs.py:Batch.__delitem__",
"hyp3_sdk/jobs.py:Batch.__reverse__",
"hyp3_sdk/jobs.py:Batch.__getitem__"
],
"edited_module... | ASFHyP3/hyp3-sdk | b3e64fdef9d76d7abb6bd762ae1b8429ebd1e3f5 | slicing a Batch returns a list
Should return a Batch instead.
```
>>> import hyp3_sdk
>>> hyp3 = hyp3_sdk.HyP3()
>>> jobs = hyp3.find_jobs()
>>> type(jobs)
<class 'hyp3_sdk.jobs.Batch'>
>>> len(jobs)
955
>>> type(jobs[3:10])
<class 'list'>
``` | diff --git a/CHANGELOG.md b/CHANGELOG.md
index 2c340f0..55972fc 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -7,7 +7,14 @@ and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
+## [1.4.1](https://github.com/ASFHyP3/... |
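A minimal sketch of the slice-aware `__getitem__` this fix needs; strings stand in for real `Job` objects, and the actual SDK class carries more behaviour:

```python
class Batch:
    def __init__(self, jobs=None):
        self.jobs = jobs if jobs is not None else []

    def __getitem__(self, index):
        # Wrap slices back into a Batch; an integer index
        # still returns the single underlying job.
        if isinstance(index, slice):
            return Batch(self.jobs[index])
        return self.jobs[index]

    def __len__(self):
        return len(self.jobs)


jobs = Batch(['job0', 'job1', 'job2', 'job3'])
print(type(jobs[1:3]).__name__)  # Batch
print(jobs[0])                   # job0
```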
ASFHyP3__hyp3-sdk-51 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"hyp3_sdk/hyp3.py:HyP3.__init__"
],
"edited_modules": [
"hyp3_sdk/hyp3.py:HyP3"
]
},
"file": "hyp3_sdk/hyp3.py"
}
] | ASFHyP3/hyp3-sdk | 67e33235f7dc3b98241fe34d97a4fae58873590c | Add custom User Agent header to hyp3 api session
e.g. `User-Agent: hyp3-sdk v0.1.2` so we can identify SDK-generated requests in the API access logs, separate from other requests made via `requests`. | diff --git a/hyp3_sdk/hyp3.py b/hyp3_sdk/hyp3.py
index 7d90095..baf69f4 100644
--- a/hyp3_sdk/hyp3.py
+++ b/hyp3_sdk/hyp3.py
@@ -6,6 +6,7 @@ from urllib.parse import urljoin
from requests.exceptions import HTTPError, RequestException
+import hyp3_sdk
from hyp3_sdk.exceptions import HyP3Error, ValidationError
fro... |
ASFHyP3__hyp3-sdk-53 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"hyp3_sdk/jobs.py:Batch.__init__"
],
"edited_modules": [
"hyp3_sdk/jobs.py:Batch"
]
},
"file": "hyp3_sdk/jobs.py"
}
] | ASFHyP3/hyp3-sdk | 56cfb700341a0de44ee0f2f3548d5ed6c534d659 | Batch constructor should create an empty batch by default
Currently, calling `jobs = Batch()` raises `TypeError: __init__() missing 1 required positional argument: 'jobs'`.
To construct an empty batch, the user has to write `jobs = Batch([])`. It would be more intuitive if this were the default behavior without hav... | diff --git a/CHANGELOG.md b/CHANGELOG.md
index 8905268..ddcacaa 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -6,6 +6,14 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning... |
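The intuitive default is a one-line change; the sketch below uses `None` rather than a mutable `[]` default so separate batches never share one job list (illustrative, not the SDK's exact code):

```python
class Batch:
    def __init__(self, jobs=None):
        # 'jobs=[]' as the default would share a single list
        # across every Batch created without arguments.
        self.jobs = jobs if jobs is not None else []


print(Batch().jobs)            # [] -- no TypeError any more
print(Batch(['a', 'b']).jobs)  # ['a', 'b']
```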
ASFHyP3__hyp3-sdk-70 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"hyp3_sdk/exceptions.py:ValidationError"
]
},
"file": "hyp3_sdk/exceptions.py"
},
{
"changes": {
"added_entities": [
"hyp3_sdk/hyp3.py:Hy... | ASFHyP3/hyp3-sdk | 6e4004e372771dc444bf5f334f1f8e25a39313bf | use fewer requests when submitting multiple jobs
When submitting large jobs, multiple API requests are created; it would be nice to have a way to aggregate jobs into one request | diff --git a/CHANGELOG.md b/CHANGELOG.md
index 2ef243f..620eb3f 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -6,6 +6,19 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning... |
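One way to cut the request count, sketched under assumptions (the real endpoint, payload shape, and any server-side batch limit are not taken from the SDK): chunk the prepared job dictionaries and submit each chunk in a single request.

```python
def chunks(jobs, size=200):
    """Split a job list so N jobs need ceil(N / size) requests, not N."""
    for i in range(0, len(jobs), size):
        yield jobs[i:i + size]


# Hypothetical prepared-job dictionaries.
prepared = [{'job_type': 'RTC_GAMMA', 'granule': 'granule-%d' % i}
            for i in range(450)]
print(sum(1 for _ in chunks(prepared)))  # 3 requests instead of 450
```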
ASFHyP3__hyp3-sdk-71 | [
{
"changes": {
"added_entities": [
"hyp3_sdk/jobs.py:Batch.__iter__"
],
"added_modules": null,
"edited_entities": [
"hyp3_sdk/jobs.py:Batch.__len__"
],
"edited_modules": [
"hyp3_sdk/jobs.py:Batch"
]
},
"file": "hyp3_sdk/jobs.py"
}
] | ASFHyP3/hyp3-sdk | b8011c957ce5759bd64007c2116d202fdb5a6dae | Batch should be iterable
Attempting to iterate over a Batch object currently fails with `TypeError: 'Batch' object is not iterable`.
```
> import hyp3_sdk
> api = hyp3_sdk.HyP3()
> jobs = api.find_jobs(name='refactor')
> sizes = [job['files'][0]['size'] for job in jobs]
Traceback (most recent call last):
Fil... | diff --git a/CHANGELOG.md b/CHANGELOG.md
index 620eb3f..38529ae 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -15,6 +15,7 @@ and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
- `HyP3.prepare_insar_job`
### Changed
+- HyP3 `Batch` objects are now iterable
- HyP3 submit methods will always... |
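Making the traceback's list comprehension work only takes delegating iteration to the wrapped list; dicts stand in for `Job` objects in this sketch:

```python
class Batch:
    def __init__(self, jobs=None):
        self.jobs = jobs if jobs is not None else []

    def __iter__(self):
        # Iterating a Batch just iterates its underlying job list.
        return iter(self.jobs)


batch = Batch([{'size': 10}, {'size': 20}])
print([job['size'] for job in batch])  # [10, 20]
```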
ASFHyP3__hyp3-sdk-73 | [
{
"changes": {
"added_entities": [
"hyp3_sdk/hyp3.py:HyP3.get_job_by_id"
],
"added_modules": null,
"edited_entities": [
"hyp3_sdk/hyp3.py:HyP3._get_job_by_id",
"hyp3_sdk/hyp3.py:HyP3._refresh_job"
],
"edited_modules": [
"hyp3_sdk/hyp3.py:HyP3"
... | ASFHyP3/hyp3-sdk | 1fec8b5ae4c2cf80392cc6e27a52123e72e320e0 | _get_job_by_id shouldn't be private
https://github.com/ASFHyP3/hyp3-sdk/blob/develop/hyp3_sdk/hyp3.py#L72
Turns out it's pretty useful generally. | diff --git a/CHANGELOG.md b/CHANGELOG.md
index 71619ef..d83763d 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -35,6 +35,7 @@ and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
submit one or more prepared job dictionaries.
- `Job.download_files` and `Batch.download_files` will (optionally) create... |
ASPP__pelita-412 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": null
},
"file": "pelita/player/__init__.py"
},
{
"changes": {
"added_entities": null,
"added_modules": [
"pelita/player/base.py:SteppingPlayer"
... | ASPP/pelita | 002ae9e325b1608a324d02749205cd70b4f6da2b | pytest warns about our TestPlayer
WC1 /tmp/group1/test/test_drunk_player.py cannot collect test class 'TestPlayer' because it has a __init__ constructor
Maybe rename it? | diff --git a/pelita/player/__init__.py b/pelita/player/__init__.py
index cf429bca..bedaae24 100644
--- a/pelita/player/__init__.py
+++ b/pelita/player/__init__.py
@@ -1,7 +1,7 @@
from .base import AbstractTeam, SimpleTeam, AbstractPlayer
-from .base import (StoppingPlayer, TestPlayer, SpeakingPlayer,
+from .base i... |
ASPP__pelita-619 | [
{
"changes": {
"added_entities": [
"pelita/layout.py:layout_for_team"
],
"added_modules": [
"pelita/layout.py:layout_for_team"
],
"edited_entities": [
"pelita/layout.py:parse_layout",
"pelita/layout.py:parse_single_layout",
"pelita/layout.py:... | ASPP/pelita | 8c4b83c5fcfd1b748af5cfe8b0b09e93ab5a6406 | parse_layout should fail when a bot is defined on different coordinates
from pelita.layout import parse_layout
layout = """
######
#000.#
#.111#
###### """
print(parse_layout(layout)['bots'])
[(3, 1), (4, 2), None, None]
should raise `ValueError` | diff --git a/pelita/layout.py b/pelita/layout.py
index 1ad10138..6fde0227 100644
--- a/pelita/layout.py
+++ b/pelita/layout.py
@@ -9,12 +9,6 @@ except SyntaxError as err:
print("Invalid syntax in __layouts module. Pelita will not be able to use built-in layouts.")
print(err)
-class Layout:
- pass
-
-clas... |
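The requested validation reduces to rejecting any bot character parsed at more than one coordinate. A sketch with a simplified data structure (`parse_layout`'s real bookkeeping is richer):

```python
def check_bot_coordinates(found):
    """Raise ValueError if a bot character appears at two different positions.

    'found' maps each bot character to every coordinate it was parsed at.
    """
    for char, coords in found.items():
        if len(set(coords)) > 1:
            raise ValueError(
                'bot {!r} defined on different coordinates: {}'.format(
                    char, sorted(set(coords))))


check_bot_coordinates({'0': [(3, 1)], '1': [(4, 2)]})  # consistent: no error
try:
    check_bot_coordinates({'0': [(1, 1), (3, 2)]})
except ValueError as err:
    print(err)
```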
ASPP__pelita-635 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"pelita/game.py:setup_game",
"pelita/game.py:prepare_bot_state",
"pelita/game.py:apply_move"
],
"edited_modules": [
"pelita/game.py:setup_game",
"pelita/game.p... | ASPP/pelita | ffe76f4adeca90e7c9e4542ab61a1deb5081408f | rename Bot.has_respawned
What about renaming `Bot.has_respawned` to `Bot.was_killed`? The word _respawned_ is totally meaningless for most non-hardcore programmers. | diff --git a/pelita/game.py b/pelita/game.py
index 5789a9b3..2dd68761 100644
--- a/pelita/game.py
+++ b/pelita/game.py
@@ -275,7 +275,7 @@ def setup_game(team_specs, *, layout_dict, max_rounds=300, layout_name="", seed=
kills = [0]*4,
# List of boolean flags weather bot has been eaten since its last... |
ASPP__pelita-655 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"pelita/layout.py:parse_layout",
"pelita/layout.py:parse_single_layout",
"pelita/layout.py:layout_as_str"
],
"edited_modules": [
"pelita/layout.py:parse_layout",
... | ASPP/pelita | 1108fc71cdc9a7eeb4563149e9821255d6f56bf3 | print(bot) should show which enemies are noisy
This will hopefully avoid confusion.
One remark: since we got rid of set_initial in the new-style API, the teams never see their enemies sitting unnoised on their initial positions, which has been a nice (and easy) starting point for filtering. Question: Do we want to b... | diff --git a/pelita/layout.py b/pelita/layout.py
index 369da014..df604a43 100644
--- a/pelita/layout.py
+++ b/pelita/layout.py
@@ -117,8 +117,9 @@ def parse_layout(layout_str, allow_enemy_chars=False):
In this case, bot '0' and bot '2' are on top of each other at position (1,1)
If `allow_enemy_chars` is Tru... |
ASPP__pelita-696 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"pelita/layout.py:layout_agnostic"
],
"edited_modules": [
"pelita/layout.py:layout_agnostic"
]
},
"file": "pelita/layout.py"
}
] | ASPP/pelita | 557c3a757a24e0f1abe25f7edf5c4ffee83a077e | layout_agnostic needs tests and fixes
Currently broken. https://github.com/ASPP/pelita/blob/2f17db5355b4dffae8a130ede549ab869b2f1ce2/pelita/layout.py#L548-L566
| diff --git a/pelita/layout.py b/pelita/layout.py
index e5797adc..66fa2ebd 100644
--- a/pelita/layout.py
+++ b/pelita/layout.py
@@ -545,9 +545,9 @@ def layout_for_team(layout, is_blue=True, is_noisy=(False, False)):
'is_noisy' : is_noisy,
}
-def layout_agnostic(layout_for_team, is_blue=True):
- """ Co... |
ASPP__pelita-708 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"check_layout_consistency.py:get_hw",
"check_layout_consistency.py:layout_to_graph"
],
"edited_modules": [
"check_layout_consistency.py:get_hw",
"check_layout_consiste... | ASPP/pelita | 8c47e30cdf8b6dabf173ebfe71170e36b2aaa35e | Add width and height to the layout dictionary
As it is needed in several places, it makes sense to just stick two additional keys into the layout dictionary. Fixing this bug also means getting rid of all `max(walls)[0]+1`, `max(walls)[1]+1`, `sorted(walls)[-1][0]`, `sorted(walls)[-1][1]` spread in the pelita and pelita... | diff --git a/check_layout_consistency.py b/check_layout_consistency.py
index f3e873c6..cb6b9689 100644
--- a/check_layout_consistency.py
+++ b/check_layout_consistency.py
@@ -1,36 +1,11 @@
"""Detect if a layout contains "chambers" with food"""
import sys
-import networkx
+import networkx as nx
from pelita.layout i... |
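Because a pelita maze is enclosed by a rectangular wall, the lexicographically largest wall coordinate is always the bottom-right corner, so the shape can be computed once and stored in the layout dictionary instead of re-deriving it with `max(walls)` everywhere. A small sketch:

```python
# Border walls of a 4x3 maze; the interior squares are free.
walls = {(x, y) for x in range(4) for y in range(3)
         if x in (0, 3) or y in (0, 2)}

# max() on tuples compares lexicographically, so max(walls) is (3, 2) here.
width = max(walls)[0] + 1
height = max(walls)[1] + 1
layout = {'walls': walls, 'shape': (width, height)}
print(layout['shape'])  # (4, 3)
```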
ASPP__pelita-798 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"pelita/game.py:prepare_bot_state"
],
"edited_modules": [
"pelita/game.py:prepare_bot_state"
]
},
"file": "pelita/game.py"
},
{
"changes": {
"added_entitie... | ASPP/pelita | 245a9ca445eccda07998cb58fbee340308d425a7 | make the cumulative time counter available to clients
The `Bot` object should have an attribute `time` where the cumulative time for the team shown in the GUI is made available. This is useful for example when benchmarking different strategies. There's no way at the moment to programmatically access that value, as far ... | diff --git a/pelita/game.py b/pelita/game.py
index 6cbacc56..efc51f3a 100644
--- a/pelita/game.py
+++ b/pelita/game.py
@@ -555,7 +555,8 @@ def prepare_bot_state(game_state, idx=None):
'bot_was_killed': game_state['bot_was_killed'][own_team::2],
'error_count': len(game_state['errors'][own_team]),
... |
ASPP__pelita-815 | [
{
"changes": {
"added_entities": [
"contrib/ci_engine.py:CI_Engine.get_team_name",
"contrib/ci_engine.py:DB_Wrapper.add_team_name",
"contrib/ci_engine.py:DB_Wrapper.get_team_name"
],
"added_modules": null,
"edited_entities": [
"contrib/ci_engine.py:CI_Engi... | ASPP/pelita | 6e70daf7313ca878e3a46240ca6a20811024e9db | Add team name in CI table
Remote players only show their team names making comparisons harder.
Conversely, the remote player should/could also include the spec name. | diff --git a/contrib/ci_engine.py b/contrib/ci_engine.py
index c1cb84d0..f44194f1 100755
--- a/contrib/ci_engine.py
+++ b/contrib/ci_engine.py
@@ -120,11 +120,15 @@ class CI_Engine:
self.dbwrapper.add_player(pname, new_hash)
for player in self.players:
+ path = player['path']
+ ... |
ASPP__pelita-863 | [
{
"changes": {
"added_entities": [
"pelita/team.py:Bot.__repr__"
],
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"pelita/team.py:Bot"
]
},
"file": "pelita/team.py"
}
] | ASPP/pelita | ab9217f298fa4897b06e5e9e9e7fad7e29ba7114 | Better Bot repr
For reference, this is the current `repr(bot)`:
<pelita.team.Bot object at 0x102778b90>
This is `str(bot)`:
```
Playing on red side. Current turn: 1. Bot: y. Round: 1, score: 0:0. timeouts: 0:0
################################
#. . #. . . #.y#
# # #### ### . # . .x#
# ... | diff --git a/pelita/team.py b/pelita/team.py
index a3ef1138..aea97327 100644
--- a/pelita/team.py
+++ b/pelita/team.py
@@ -726,6 +726,9 @@ class Bot:
out.write(footer)
return out.getvalue()
+ def __repr__(self):
+ return f'<Bot: {self.char} ({"blue" if self.is_blue else "red"}), {s... |
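The patch hunk above is truncated, so the exact format string is not reproduced here; below is a hedged sketch of a repr in that shape, with illustrative attribute names:

```python
class Bot:
    def __init__(self, char, is_blue, position, current_round):
        self.char = char
        self.is_blue = is_blue
        self.position = position
        self.round = current_round

    def __repr__(self):
        # Short one-line summary, unlike the full-maze __str__.
        side = 'blue' if self.is_blue else 'red'
        return '<Bot: {} ({}), {}, round: {}>'.format(
            self.char, side, self.position, self.round)


print(repr(Bot('y', False, (14, 1), 1)))  # <Bot: y (red), (14, 1), round: 1>
```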
ASPP__pelita-875 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"pelita/utils.py:run_background_game"
],
"edited_modules": [
"pelita/utils.py:run_background_game"
]
},
"file": "pelita/utils.py"
}
] | ASPP/pelita | 68af15d8d4199882d32bb4ede363195e2c5b5a99 | run_background_game breaks when layout= is passed
```
def test_run_background_game_with_layout():
test_layout = (
""" ##################
#a#. . # . #
#b##### #####x#
# . # . .#y#
################## """)
result = utils.run_background_game(blue_move=stopping_play... | diff --git a/pelita/utils.py b/pelita/utils.py
index c5bdf54c..1f3bec36 100644
--- a/pelita/utils.py
+++ b/pelita/utils.py
@@ -104,7 +104,7 @@ def run_background_game(*, blue_move, red_move, layout=None, max_rounds=300, see
if layout is None:
layout_dict = generate_maze(rng=rng)
else:
- layout... |
AVEgame__AVE-108 | [
{
"changes": {
"added_entities": [
"ave/components/numbers.py:Number.get_all_variables",
"ave/components/numbers.py:Constant.get_all_variables",
"ave/components/numbers.py:Sum.get_all_variables",
"ave/components/numbers.py:Product.get_all_variables",
"ave/components... | AVEgame/AVE | 29c1a6f2f58198e3af2bde3b457af6cf8053b6af | Write code to run detailed checks that a game works
Put this in `ave.test`. It can then be used by the AVE pytest tests, and for AVEgame/AVE-usergames#2 | diff --git a/ave/components/numbers.py b/ave/components/numbers.py
index 19fe14d..e321b64 100644
--- a/ave/components/numbers.py
+++ b/ave/components/numbers.py
@@ -10,6 +10,10 @@ class Number:
"""Get the value of the Number."""
raise NotImplementedError()
+ def get_all_variables(self):
+ ... |
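The `get_all_variables` methods added across `ave/components/numbers.py` walk the expression tree and union the variables of every sub-node. A stripped-down sketch (`Variable` is a stand-in leaf class invented for illustration):

```python
class Constant:
    def __init__(self, value):
        self.value = value

    def get_all_variables(self):
        return set()  # a literal uses no variables


class Variable:
    def __init__(self, name):
        self.name = name

    def get_all_variables(self):
        return {self.name}


class Sum:
    def __init__(self, terms):
        self.terms = terms

    def get_all_variables(self):
        out = set()
        for term in self.terms:  # union over all sub-expressions
            out |= term.get_all_variables()
        return out


expr = Sum([Constant(1), Variable('gold'), Sum([Variable('luck')])])
print(sorted(expr.get_all_variables()))  # ['gold', 'luck']
```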
AVEgame__AVE-113 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"ave/ave.py:AVE.get_download_menu"
],
"edited_modules": [
"ave/ave.py:AVE"
]
},
"file": "ave/ave.py"
},
{
"changes": {
"added_entities": null,
"added... | AVEgame/AVE | b39a5ed00692e456e4d2533cde44e46830cc90a2 | Check that running AVE version is high enough to play library games
I think I already did this, but needs testing. | diff --git a/ave/ave.py b/ave/ave.py
index 52fb295..cb4a94d 100644
--- a/ave/ave.py
+++ b/ave/ave.py
@@ -135,16 +135,12 @@ class AVE:
A list of the title, author and local url for each game.
"""
try:
- the_json = load_library_json()
+ library = load_library_json()
... |
AVEgame__AVE-96 | [
{
"changes": {
"added_entities": [
"ave/__main__.py:make_json"
],
"added_modules": [
"ave/__main__.py:make_json"
],
"edited_entities": [
"ave/__main__.py:run"
],
"edited_modules": [
"ave/__main__.py:run"
]
},
"file": "ave/__... | AVEgame/AVE | 8aad627bf790ca8e452426d7fba5d74ecb75f0a3 | Create a built in games manifest
We should have a JSON manifest of all built-in games constructed before each release. This manifest could then be read to determine the default games on the menu.
A similar strategy could be used for games hosted online. Each time a game is uploaded, it is added to the online manifes... | diff --git a/.gitignore b/.gitignore
index cd234b2..3e2b7e2 100644
--- a/.gitignore
+++ b/.gitignore
@@ -5,3 +5,4 @@
build
dist
+gamelist.json
diff --git a/ave/__main__.py b/ave/__main__.py
index 5d8e0ad..971ab74 100644
--- a/ave/__main__.py
+++ b/ave/__main__.py
@@ -1,10 +1,33 @@
"""Functions to run AVE."""
+im... |
Aarhus-Psychiatry-Research__timeseriesflattener-106 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"src/timeseriesflattener/feature_spec_objects.py:load_df_with_cache",
"src/timeseriesflattener/feature_spec_objects.py:resolve_from_dict_or_registry"
],
"edited_modules": [
"s... | Aarhus-Psychiatry-Research/timeseriesflattener | c1a3947a2e1993aa336075856021d009f8a11ad8 | loader_kwargs in feature_spec_object classes
loader_kwargs not specified as argument in all spec classes | diff --git a/src/timeseriesflattener/feature_spec_objects.py b/src/timeseriesflattener/feature_spec_objects.py
index 5315137..5f2682f 100644
--- a/src/timeseriesflattener/feature_spec_objects.py
+++ b/src/timeseriesflattener/feature_spec_objects.py
@@ -23,7 +23,7 @@ log = logging.getLogger(__name__)
@cache
def load_d... |
Aarhus-Psychiatry-Research__timeseriesflattener-186 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"src/timeseriesflattener/feature_spec_objects.py:TemporalSpec.get_col_str"
],
"edited_modules": [
"src/timeseriesflattener/feature_spec_objects.py:TemporalSpec",
"src/timeseri... | Aarhus-Psychiatry-Research/timeseriesflattener | bb6a7fffb2a520272fcb5d7129957ce22484ff77 | fix: change type hints in specs to allow for floats in interval_days
Currently, float inputs to interval_days args in predictor and outcome specs are coerced into integers. Thus, it is not possible to generate predictors/outcomes with non-integer lookbehind/ahead windows.
- [ ] Add test | diff --git a/.github/workflows/static_type_checks.yml b/.github/workflows/static_type_checks.yml
index abf1bb5..620427d 100644
--- a/.github/workflows/static_type_checks.yml
+++ b/.github/workflows/static_type_checks.yml
@@ -32,7 +32,7 @@ jobs:
uses: actions/setup-python@v4
id: setup_python
w... |
Aarhus-Psychiatry-Research__timeseriesflattener-33 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": null
},
"file": "src/timeseriesflattener/__init__.py"
},
{
"changes": {
"added_entities": [
"src/timeseriesflattener/flattened_dataset.py:TimeseriesFlatt... | Aarhus-Psychiatry-Research/timeseriesflattener | 0b6895a23bd620615b06e442d0887bc73f345540 | Refactor: Main class TSFLattener, add df_getter method | diff --git a/src/timeseriesflattener/__init__.py b/src/timeseriesflattener/__init__.py
index c4eae1f..1516176 100644
--- a/src/timeseriesflattener/__init__.py
+++ b/src/timeseriesflattener/__init__.py
@@ -1,2 +1,2 @@
"""Init timeseriesflattener."""
-from .flattened_dataset import FlattenedDataset
+from .flattened_data... |
Aarhus-Psychiatry-Research__timeseriesflattener-337 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"src/timeseriesflattener/feature_specs/group_specs.py:PredictorGroupSpec",
"src/timeseriesflattener/feature_specs/group_specs.py:OutcomeGroupSpec"
]
},
... | Aarhus-Psychiatry-Research/timeseriesflattener | 0f21608e4c3743545c6aadb844493c47e895fc20 | feat: add option to specify time range in predictions
Creating features in bins of e.g. 0-7, 7-30, 30-90, 90-365, 365-... days from prediction time instead of always going from 0-n as we do now, could potentially keep more temporal information and create better predictors.
| diff --git a/docs/tutorials/01_basic.ipynb b/docs/tutorials/01_basic.ipynb
index 9bdd35d..a354fae 100644
--- a/docs/tutorials/01_basic.ipynb
+++ b/docs/tutorials/01_basic.ipynb
@@ -52,6 +52,15 @@
"execution_count": 1,
"metadata": {},
"outputs": [
+ {
+ "name": "stderr",
+ "output_type": "stream",... |
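The binned-lookbehind idea in the feature request above (0-7, 7-30, 30-90, 90-365 days from prediction time) amounts to turning a list of cut-points into (start, end) interval pairs. An illustrative helper, not the library's actual API:

```python
def bins_from_cutpoints(cutpoints):
    """Turn cut-points like [0, 7, 30, 90] into interval pairs [(0, 7), (7, 30), (30, 90)].

    Each pair could then feed one feature spec's lookbehind start/end.
    """
    if sorted(cutpoints) != list(cutpoints):
        raise ValueError("cut-points must be sorted ascending")
    return list(zip(cutpoints[:-1], cutpoints[1:]))
```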
Aarhus-Psychiatry-Research__timeseriesflattener-48 | [
{
"changes": {
"added_entities": [
"src/timeseriesflattener/feature_spec_objects.py:PredictorSpec.get_cutoff_date",
"src/timeseriesflattener/feature_spec_objects.py:OutcomeSpec.get_cutoff_date"
],
"added_modules": null,
"edited_entities": [
"src/timeseriesflattene... | Aarhus-Psychiatry-Research/timeseriesflattener | 35831b493805d0c33ad447d8a8d8f868e77f8d68 | feat: drop if insufficient lookbehind or lookahead
Based on min in values_df?
Should be based on max in values_df.
1. Refactor to collect all specs before adding
2. Have one shared mechanism of adding specs, and allow it to take a list or a single spec at a time
3. Get the latest required date for sufficient lookbehi... | diff --git a/src/timeseriesflattener/feature_spec_objects.py b/src/timeseriesflattener/feature_spec_objects.py
index c010b0a..c4a3e81 100644
--- a/src/timeseriesflattener/feature_spec_objects.py
+++ b/src/timeseriesflattener/feature_spec_objects.py
@@ -135,8 +135,17 @@ class AnySpec(BaseModel):
def __init__(self, ... |
Aarhus-Psychiatry-Research__timeseriesflattener-54 | [
{
"changes": {
"added_entities": [
"src/timeseriesflattener/feature_spec_objects.py:resolve_from_dict_or_registry",
"src/timeseriesflattener/feature_spec_objects.py:MinGroupSpec._check_loaders_are_valid"
],
"added_modules": [
"src/timeseriesflattener/feature_spec_object... | Aarhus-Psychiatry-Research/timeseriesflattener | ef219b6b829c1f29dabbe13b503fca12adaaeaad | feat: take multiple features as long format | diff --git a/src/timeseriesflattener/feature_spec_objects.py b/src/timeseriesflattener/feature_spec_objects.py
index 0ae0db2..e739fa9 100644
--- a/src/timeseriesflattener/feature_spec_objects.py
+++ b/src/timeseriesflattener/feature_spec_objects.py
@@ -12,7 +12,7 @@ from pydantic import BaseModel as PydanticBaseModel
... |
Aarhus-Psychiatry-Research__timeseriesflattener-59 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"src/timeseriesflattener/feature_spec_objects.py:TemporalSpec"
]
},
"file": "src/timeseriesflattener/feature_spec_objects.py"
},
{
"changes": {
"... | Aarhus-Psychiatry-Research/timeseriesflattener | 505b6c86f16299ce5643c4eb2e12f0a444a4394b | refactor: remove mentions of dw_ek_borger (waiting for no open PRs, very likely to cause conflicts)
1. Rename all occurences of "dw_ek_borger" to "entity_id"
2. Remove it as a default in Timeseriesflattener | diff --git a/CHANGELOG.md b/CHANGELOG.md
index 2d9d748..5ba85d0 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -2,13 +2,6 @@
<!--next-version-placeholder-->
-## v0.19.0 (2022-12-08)
-### Feature
-* More informative errors ([`3141487`](https://github.com/Aarhus-Psychiatry-Research/timeseriesflattener/commit/314148... |
Aarhus-Psychiatry-Research__timeseriesflattener-62 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"src/timeseriesflattener/feature_spec_objects.py:TemporalSpec.__init__",
"src/timeseriesflattener/feature_spec_objects.py:PredictorSpec.get_cutoff_date",
"src/timeseriesflattener/feature_... | Aarhus-Psychiatry-Research/timeseriesflattener | 734c38b5e8fda8c5643ad389c00a734aeab82900 | fix: remove hardcoded timestamp names | diff --git a/src/timeseriesflattener/feature_spec_objects.py b/src/timeseriesflattener/feature_spec_objects.py
index b498393..a1726bf 100644
--- a/src/timeseriesflattener/feature_spec_objects.py
+++ b/src/timeseriesflattener/feature_spec_objects.py
@@ -232,9 +232,6 @@ class TemporalSpec(AnySpec):
id_col_name: str ... |
AbhinavOmprakash__py-htminify-8 | [
{
"changes": {
"added_entities": [
"htminify/htminify.py:_protect_text",
"htminify/htminify.py:_substitute_with_hex_value",
"htminify/htminify.py:_reintroduce_protected_text"
],
"added_modules": [
"htminify/htminify.py:_protect_text",
"htminify/htminify.... | AbhinavOmprakash/py-htminify | cd30ff52a48f28233d17709f4f36f14c206532ff | code blocks don't render properly

| diff --git a/README.rst b/README.rst
index e22cf54..b8994d7 100644
--- a/README.rst
+++ b/README.rst
@@ -7,7 +7,7 @@ ________
* Using a web framework, like django, flask, and pyramid? We got you covered.
* Or you're feeling adventurous and you're building your own wsgi app? We got you covered there too. This will w... |
Abjad__abjad-ext-nauert-24 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"abjadext/nauert/gracehandlers.py:ConcatenatingGraceHandler.__init__"
],
"edited_modules": [
"abjadext/nauert/gracehandlers.py:ConcatenatingGraceHandler"
]
},
"file": "a... | Abjad/abjad-ext-nauert | 520f389f06e21ee0a094016b4f1e2b0cb58263c1 | Check gracehandlers behaviors
There seem to be some odd behaviors in handling grace notes.
The first odd behavior results in a "grace rest" attaching to a pitched note, as shown below:
```
import abjad
from abjadext import nauert
quantizer = nauert.Quantizer()
durations = [1000, 1, 999]
pitches = [0, None, 0... | diff --git a/abjadext/nauert/gracehandlers.py b/abjadext/nauert/gracehandlers.py
index 8813e0f..a2dbdd3 100644
--- a/abjadext/nauert/gracehandlers.py
+++ b/abjadext/nauert/gracehandlers.py
@@ -199,8 +199,8 @@ class ConcatenatingGraceHandler(GraceHandler):
.. container:: example
- When ``replace_rest_wi... |
ActivisionGameScience__assertpy-55 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": null
},
"file": "assertpy/__init__.py"
},
{
"changes": {
"added_entities": [
"assertpy/assertpy.py:soft_assertions",
"assertpy/assertpy.py:assert... | ActivisionGameScience/assertpy | ed43bee91eadd55f6cc9004e6f3862a97e0d2190 | correct implementation of soft assertions
Hi!
This is not a bug report, but more like a discussion kick-starter regarding soft assertions. And if we happen to agree on a different implementation, I'll be more than happy to create a PR.
What I suggest is soft assertions to be implemented as in other languages lib... | diff --git a/README.md b/README.md
index 91b1eb5..99edf06 100644
--- a/README.md
+++ b/README.md
@@ -282,7 +282,7 @@ Fluent assertions against the value of a given key can be done by prepending `ha
```py
fred = {'first_name': 'Fred', 'last_name': 'Smith', 'shoe_size': 12}
-
+
assert_that(fred).has_first_name('Fre... |
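The implementation the issue above argues for — collect every failure inside a block and raise once at the end — can be sketched as a context manager. This is illustrative only; assertpy's real `soft_assertions` wraps its fluent `assert_that` API rather than a bare check function:

```python
from contextlib import contextmanager


class SoftAssertionError(AssertionError):
    """Raised once, at block exit, with every collected failure message."""


@contextmanager
def soft_assertions():
    failures = []

    def check(condition, message):
        # Record the failure but let subsequent checks keep running.
        if not condition:
            failures.append(message)

    yield check
    if failures:
        raise SoftAssertionError("; ".join(failures))
```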
Adyen__adyen-python-api-library-276 | [
{
"changes": {
"added_entities": [
"Adyen/client.py:AdyenClient._raise_http_error"
],
"added_modules": null,
"edited_entities": [
"Adyen/client.py:AdyenClient._set_url_version",
"Adyen/client.py:AdyenClient._handle_response",
"Adyen/client.py:AdyenClient._... | Adyen/adyen-python-api-library | 72bd79756c6fe5de567e7ca0e61b27d304d7e8c0 | `TerminalsTerminalLevelApi.reassign_terminal` throws JSONDecodeError
**Describe the bug**
All calls to `TerminalsTerminalLevelApi.reassign_terminal` throw a JSONDecodeError
**To Reproduce**
```python
from Adyen import AdyenClient
from Adyen.services.management import TerminalsTerminalLevelApi
API_KEY = ... | diff --git a/Adyen/__init__.py b/Adyen/__init__.py
index 712155e..3e9a8a8 100644
--- a/Adyen/__init__.py
+++ b/Adyen/__init__.py
@@ -1,5 +1,3 @@
-#!/bin/python
-
from __future__ import absolute_import, division, unicode_literals
from . import util
diff --git a/Adyen/client.py b/Adyen/client.py
index cd45b98..2e40e9... |
AgentOps-AI__AgentStack-77 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": null
},
"file": "agentstack/cli/__init__.py"
},
{
"changes": {
"added_entities": [
"agentstack/cli/cli.py:configure_default_model"
],
"added_... | AgentOps-AI/AgentStack | c2725af63fefa393169f30be0689f2b4f3f0e4b3 | Dynamically load model providers
In the agent wizard section of the CLI, it asks to enter the model and provider for the agent to use.
Any provider/model that works in LiteLLM should be accepted.
Import or create a list of all acceptable providers and associated models.
In the AgentWizard, ask the user to sele... | diff --git a/agentstack/cli/__init__.py b/agentstack/cli/__init__.py
index 3c35ec3..afd42af 100644
--- a/agentstack/cli/__init__.py
+++ b/agentstack/cli/__init__.py
@@ -1,1 +1,1 @@
-from .cli import init_project_builder, list_tools
+from .cli import init_project_builder, list_tools, configure_default_model
diff --git a... |
Akkudoktor-EOS__EOS-459 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"src/akkudoktoreos/utils/visualize.py:prepare_visualize"
],
"edited_modules": [
"src/akkudoktoreos/utils/visualize.py:prepare_visualize"
]
},
"file": "src/akkudoktoreos/... | Akkudoktor-EOS/EOS | d912561bfbe5c1c97505f89225c3f9650b00c3c7 | [BUG]: Exception in visualize
### Describe the issue:
optimize results in exception.
### Reproduceable code example:
```python
# report.create_line_chart_date( ... | diff --git a/src/akkudoktoreos/utils/visualize.py b/src/akkudoktoreos/utils/visualize.py
index fc684c9..51ccc63 100644
--- a/src/akkudoktoreos/utils/visualize.py
+++ b/src/akkudoktoreos/utils/visualize.py
@@ -454,7 +454,9 @@ def prepare_visualize(
[
np.full(
len(parameters.ems.ges... |
Alexei-Kornienko__schematics_to_swagger-7 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"schematics_to_swagger/__init__.py:model_to_definition"
],
"edited_modules": [
"schematics_to_swagger/__init__.py:model_to_definition"
]
},
"file": "schematics_to_swagge... | Alexei-Kornienko/schematics_to_swagger | 3ddc537a8ed7682e9bb709ebd749b99d7ef09473 | Hide private model fields in swagger doc | diff --git a/schematics_to_swagger/__init__.py b/schematics_to_swagger/__init__.py
index d108f3f..d203de0 100644
--- a/schematics_to_swagger/__init__.py
+++ b/schematics_to_swagger/__init__.py
@@ -54,17 +54,24 @@ def _map_schematics_type(t):
def model_to_definition(model):
- fields = model.fields.items()
+ p... |
AlexisBRENON__ewmh_m2m-15 | [
{
"changes": {
"added_entities": [
"src/ewmh_m2m/geometry.py:Geometry.horizontally_overlap",
"src/ewmh_m2m/geometry.py:Geometry.vertically_overlap",
"src/ewmh_m2m/geometry.py:Geometry.overlap"
],
"added_modules": null,
"edited_entities": null,
"edited_module... | AlexisBRENON/ewmh_m2m | c70bb48fd102fc526112f4cfb7c33ae157d83037 | No sibling screen found - Error
Thanks for the great script! Unfortunately this is what I get:
$ move-to-monitor -v -v
DEBUG:ewmh_m2m.__main__:Detected screens: {Geometry(2960, 0, 1920, 1200), Geometry(0, 176, 1280, 1024), Geometry(1280, 150, 1680, 1050)}
DEBUG:ewmh_m2m.__main__:Containing screen: Geom... | diff --git a/.github/workflows/python.yml b/.github/workflows/python.yml
index 49e0dde..e6b449e 100644
--- a/.github/workflows/python.yml
+++ b/.github/workflows/python.yml
@@ -19,7 +19,7 @@ jobs:
- name: Set up pip
run: |
python -m pip install --upgrade pip
- pip install --upgrade s... |
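The sibling-screen search that fails in this report hinges on an axis-overlap predicate for screen rectangles; the file_changes list shows `horizontally_overlap`, `vertically_overlap`, and `overlap` being added. A self-contained sketch of the idea (names chosen to mirror the patch, not copied from it):

```python
from collections import namedtuple

# x, y: top-left corner; w, h: extent, matching the Geometry(x, y, w, h)
# values in the debug log above.
Geometry = namedtuple("Geometry", ["x", "y", "w", "h"])


def _intervals_overlap(a_start, a_len, b_start, b_len):
    """Half-open intervals [start, start + len) share at least one point."""
    return a_start < b_start + b_len and b_start < a_start + a_len


def horizontally_overlap(a, b):
    return _intervals_overlap(a.x, a.w, b.x, b.w)


def vertically_overlap(a, b):
    return _intervals_overlap(a.y, a.h, b.y, b.h)
```

With the screens from the log, the 1280-wide and 1680-wide screens share no x-range (they sit side by side) but do share a y-range, which is what makes them horizontal siblings.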
Algebra8__pyopenapi3-80 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"src/pyopenapi3/builders.py:PathItemBuilder.__call__",
"src/pyopenapi3/builders.py:ParamBuilder.__init__",
"src/pyopenapi3/builders.py:ComponentBuilder.__call__",
"src/pyopenapi3/... | Algebra8/pyopenapi3 | 2237b16747c446adc2b67a080040f222c0493653 | Make parse_name_and_type_from_fmt_str return reference to components
`pyopenapi3.utils.parse_name_and_type_from_fmt_str` should be able to accept the following format: `{name:Component}`.
Currently, it will only consider data types:
```
# In parse_name_and_type_from_fmt_str
yield arg_name, getattr(pyopenapi3.da... | diff --git a/.gitignore b/.gitignore
index 8b5025d..45bfdbc 100644
--- a/.gitignore
+++ b/.gitignore
@@ -6,3 +6,4 @@ __pycache__/
pyopenapi3.egg-info/
dist/
build/
+.vscode/
diff --git a/src/pyopenapi3/builders.py b/src/pyopenapi3/builders.py
index 2fce8e4..c393181 100644
--- a/src/pyopenapi3/builders.py
+++ b/src/p... |
Algebra8__pyopenapi3-83 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"src/pyopenapi3/schemas.py:ObjectsDTSchema"
]
},
"file": "src/pyopenapi3/schemas.py"
}
] | Algebra8/pyopenapi3 | 2ef34c3213eb292703e0e5e6f2185b1e4725bbde | Allow Free-Form Objects
Consider the following example:
```
definitions:
Pet:
type: object
properties:
tags:
type: object
description: Custom tags
```
According to [Open API 3 specs on data types](https://swagger.io/docs/specification/data-models/data-types/#object), "a fre... | diff --git a/src/pyopenapi3/schemas.py b/src/pyopenapi3/schemas.py
index 141822f..daa78f6 100644
--- a/src/pyopenapi3/schemas.py
+++ b/src/pyopenapi3/schemas.py
@@ -164,7 +164,9 @@ class DTSchema(SchemaObject):
class ObjectsDTSchema(DTSchema):
type: str = Field('object', const=True)
- properties: Dict[str, U... |
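Per the OpenAPI 3 rule quoted in the issue, a free-form object is simply `type: object` with no pinned-down `properties` (equivalently, `additionalProperties: true`). A tiny predicate capturing that reading, for illustration only:

```python
def is_free_form_object(schema: dict) -> bool:
    """True for schemas like {'type': 'object'} whose values are unconstrained."""
    if schema.get("type") != "object":
        return False
    # No declared properties, and additionalProperties not explicitly disabled.
    return (not schema.get("properties")
            and schema.get("additionalProperties", True) in (True, {}))
```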
Algebra8__pyopenapi3-91 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"src/pyopenapi3/builders.py:ComponentBuilder.__call__",
"src/pyopenapi3/builders.py:ComponentBuilder._parameters"
],
"edited_modules": [
"src/pyopenapi3/builders.py:ComponentB... | Algebra8/pyopenapi3 | 1637ca6ab3186f73baaf37b09594f59a745f63bb | Unresolvable pointer due to wrong reference
Consider the following example:
```
@component.parameter
class PetId:
name = "pet_id"
description = "Pet's Unique Identifier"
in_field = "path"
schema = create_schema(String, pattern="^[a-zA-Z0-9-]+$")
required = True
@open_bldr.path
cl... | diff --git a/src/pyopenapi3/builders.py b/src/pyopenapi3/builders.py
index 164f041..a4b1663 100644
--- a/src/pyopenapi3/builders.py
+++ b/src/pyopenapi3/builders.py
@@ -5,7 +5,7 @@ import re
import yaml
-from pyopenapi3.data_types import Component
+from pyopenapi3.data_types import Component, Parameters, Schemas
... |
Algebra8__pyopenapi3-92 | [
{
"changes": {
"added_entities": [
"examples/connexion_example/ex.py:get_pets",
"examples/connexion_example/ex.py:get_pet",
"examples/connexion_example/ex.py:put_get",
"examples/connexion_example/ex.py:delete_pet"
],
"added_modules": [
"examples/connexio... | Algebra8/pyopenapi3 | 2f282f7f121550c845f77b16076b2bdf9b0b379f | Add connexion structure to /examples directory
Include everything that is required to make a simple connexion example work around #62, such as `app.py`. | diff --git a/examples/connexion_example/ex.py b/examples/connexion_example/app.py
similarity index 62%
rename from examples/connexion_example/ex.py
rename to examples/connexion_example/app.py
index 18e2a3e..42b4eb3 100644
--- a/examples/connexion_example/ex.py
+++ b/examples/connexion_example/app.py
@@ -1,8 +1,24 @@
-f... |
All-Hands-AI__OpenHands-4154 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"openhands/core/config/sandbox_config.py:SandboxConfig"
]
},
"file": "openhands/core/config/sandbox_config.py"
},
{
"changes": {
"added_entities"... | All-Hands-AI/OpenHands | c8a933590ac9bd55aa333940bacd4e323eff34bc | [Bug]: Runtime failed to build due to Mamba Error
### Is there an existing issue for the same bug?
- [X] I have checked the troubleshooting document at https://docs.all-hands.dev/modules/usage/troubleshooting
- [X] I have checked the existing issues.
### Describe the bug
```
Traceback (most recent call last):
Fi... | diff --git a/.github/workflows/ghcr-build.yml b/.github/workflows/ghcr-build.yml
index 34824777..82c30d98 100644
--- a/.github/workflows/ghcr-build.yml
+++ b/.github/workflows/ghcr-build.yml
@@ -293,7 +293,7 @@ jobs:
SANDBOX_RUNTIME_CONTAINER_IMAGE=$image_name \
TEST_IN_CI=true \
RUN_AS... |
All-Hands-AI__openhands-aci-17 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"openhands_aci/editor/editor.py:OHEditor.str_replace"
],
"edited_modules": [
"openhands_aci/editor/editor.py:OHEditor"
]
},
"file": "openhands_aci/editor/editor.py"
}
... | All-Hands-AI/openhands-aci | b551afd07cc9d84ee0322c3334dae3bcd3ee00ea | [Bug]: Editing Error "No replacement was performed" is not informative enough
Cross post from https://github.com/All-Hands-AI/OpenHands/issues/5365 | diff --git a/openhands_aci/editor/editor.py b/openhands_aci/editor/editor.py
index 0cbb0a1..e98354b 100644
--- a/openhands_aci/editor/editor.py
+++ b/openhands_aci/editor/editor.py
@@ -110,12 +110,17 @@ class OHEditor:
f'No replacement was performed, old_str `{old_str}` did not appear verbatim in {path... |
All-Hands-AI__openhands-resolver-123 | [
{
"changes": {
"added_entities": [
"openhands_resolver/send_pull_request.py:branch_exists"
],
"added_modules": [
"openhands_resolver/send_pull_request.py:branch_exists"
],
"edited_entities": [
"openhands_resolver/send_pull_request.py:send_pull_request"
... | All-Hands-AI/openhands-resolver | 84ccb9b29d786c3cb16f100b31ee456ddc622fd0 | Better handling of when a branch already exists
Currently, in `github_resolver/send_pull_request.py`, when pushing to github for a particular issue, the branch name is fixed here:
https://github.com/All-Hands-AI/openhands-resolver/blob/44aa2907d70852b7f98786c673304cc18b76d43e/openhands_resolver/send_pull_request.py#L1... | diff --git a/openhands_resolver/send_pull_request.py b/openhands_resolver/send_pull_request.py
index dd227b7..3d06d3e 100644
--- a/openhands_resolver/send_pull_request.py
+++ b/openhands_resolver/send_pull_request.py
@@ -139,6 +139,11 @@ def make_commit(repo_dir: str, issue: GithubIssue) -> None:
raise Runtime... |
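One way to get "better handling of when a branch already exists" is to probe for the fixed name and fall back to suffixed variants. The sketch below stubs the existence check as a callable so it runs without git; a real resolver would back it with something like `git ls-remote` (the `-try` suffix scheme here is hypothetical):

```python
def find_unused_branch_name(base_name, branch_exists):
    """Return base_name, or base_name-try2, -try3, ... until one is free.

    `branch_exists` is any callable taking a branch name and returning bool.
    """
    name = base_name
    attempt = 1
    while branch_exists(name):
        attempt += 1
        name = f"{base_name}-try{attempt}"
    return name
```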
All-Hands-AI__openhands-resolver-124 | [
{
"changes": {
"added_entities": [
"openhands_resolver/send_pull_request.py:main"
],
"added_modules": [
"openhands_resolver/send_pull_request.py:main"
],
"edited_entities": null,
"edited_modules": null
},
"file": "openhands_resolver/send_pull_request.p... | All-Hands-AI/openhands-resolver | 6a547e11e71659cd97f776157e68b21277b25359 | Add end-to-end tests for `send_pull_request.py`
Currently, there are no tests to make sure that argument parsing, etc. are working properly in `openhands_resolver/send_pull_request.py`.
In order to fix this, we can do the following:
1. Move the entirety of the content after `if __name__ == "__main__":` to a new `ma... | diff --git a/openhands_resolver/send_pull_request.py b/openhands_resolver/send_pull_request.py
index 1b58e14..dd227b7 100644
--- a/openhands_resolver/send_pull_request.py
+++ b/openhands_resolver/send_pull_request.py
@@ -285,7 +285,7 @@ def process_all_successful_issues(
)
-if __name__ == "__main__":
+... |
All-Hands-AI__openhands-resolver-137 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": null,
"edited_modules": null
},
"file": "openhands_resolver/__init__.py"
},
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"openh... | All-Hands-AI/openhands-resolver | 63dd2c9c905a375db53785c0a69e56951a279189 | "Unterminated quoted string"
On a run of the github action, the following error was encountered:
```bash
Run if [ "true" == "true" ]; then
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/opt/hostedtoolcache/Pyth... | diff --git a/openhands_resolver/__init__.py b/openhands_resolver/__init__.py
index 1276d02..0a8da88 100644
--- a/openhands_resolver/__init__.py
+++ b/openhands_resolver/__init__.py
@@ -1,1 +1,1 @@
-__version__ = "0.1.5"
+__version__ = "0.1.6"
diff --git a/openhands_resolver/send_pull_request.py b/openhands_resolver/sen... |
Altran-PT-GDC__Robot-Framework-Mainframe-3270-Library-93 | [
{
"changes": {
"added_entities": null,
"added_modules": null,
"edited_entities": [
"Mainframe3270/x3270.py:x3270._process_args"
],
"edited_modules": [
"Mainframe3270/x3270.py:x3270"
]
},
"file": "Mainframe3270/x3270.py"
}
] | Altran-PT-GDC/Robot-Framework-Mainframe-3270-Library | 2b1b7717383044d4112e699e8cff5c456a4c9c49 | Enable shell-like syntax for `extra_args` from file
With the current implementation of `x3270._process_args` arguments from a file are split by whitespaces, e.g.
```txt
# argfile.txt
-charset french
```
becomes ["-charset", "french"].
There are, however, resources that allow whitespace between the arguments, ... | diff --git a/Mainframe3270/x3270.py b/Mainframe3270/x3270.py
index 9b84413..b39724c 100644
--- a/Mainframe3270/x3270.py
+++ b/Mainframe3270/x3270.py
@@ -1,5 +1,6 @@
import os
import re
+import shlex
import socket
import time
from datetime import timedelta
@@ -64,12 +65,13 @@ class x3270(object):
`extra_ar... |
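The patch imports the stdlib `shlex` module, whose shell-like splitting is exactly what the argfile needs; the contrast with naive `str.split` is easy to demonstrate (the argfile line below is hypothetical):

```python
import shlex

# A hypothetical argfile line: the -xrm resource value contains a space.
line = '-charset french -xrm "wc3270.acceptHostname: my-host.example"'

# Naive whitespace splitting tears the quoted value apart.
naive = line.split()

# shlex honours quoting, yielding one argument per logical token.
shell_like = shlex.split(line)
```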
AmiiThinks__driving_gridworld-13 | [
{
"changes": {
"added_entities": [
"driving_gridworld/road.py:Road.speed_limit"
],
"added_modules": null,
"edited_entities": [
"driving_gridworld/road.py:Road.__init__",
"driving_gridworld/road.py:Road.successors"
],
"edited_modules": [
"drivin... | AmiiThinks/driving_gridworld | fbc47c68cfade4e7d95ba59a3990dfef196389a6 | Enforce a hard limit on the speed limit in `Road` to the number of rows + 1
If the speed limit is larger than this, then the physical plausibility of the simulation breaks, because the number of possible obstacle encounters across a fixed distance can depend on the car's speed and the range of its headlights (the number o... | diff --git a/driving_gridworld/road.py b/driving_gridworld/road.py
index cb519ef..559362f 100644
--- a/driving_gridworld/road.py
+++ b/driving_gridworld/road.py
@@ -142,13 +142,12 @@ def combinations(iterable, r, collection=tuple):
class Road(object):
- def __init__(self, num_rows, car, obstacles, speed_limit):... |
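Enforcing the hard cap described in this issue — a speed limit of at most the number of rows plus one — is a small guard; an illustrative version (not the repository's actual code):

```python
def clamped_speed_limit(requested_limit, num_rows):
    """Cap the speed limit at num_rows + 1, the physically plausible maximum."""
    if requested_limit < 1:
        raise ValueError("speed limit must be at least 1")
    return min(requested_limit, num_rows + 1)
```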
AnalogJ__lexicon-264 | [
{
"changes": {
"added_entities": [
"lexicon/__main__.py:generate_table_result",
"lexicon/__main__.py:handle_output"
],
"added_modules": [
"lexicon/__main__.py:generate_table_result",
"lexicon/__main__.py:handle_output"
],
"edited_entities": [
... | AnalogJ/lexicon | 59a1372a2ba31204f77a8383d0880ba62e0e6607 | [CLI] Pretty output for list method
Are there any plans to have pretty outputs (table or at least formatted) for the ```list``` operation on the CLI?
Right now, the CLI assumes a verbosity of DEBUG level, and outputs the Python representation of the result (managed by the provider). If --log_level=ERROR is used, no o... | diff --git a/lexicon/__main__.py b/lexicon/__main__.py
index d674809e..ad243f18 100644
--- a/lexicon/__main__.py
+++ b/lexicon/__main__.py
@@ -7,6 +7,7 @@ import importlib
import logging
import os
import sys
+import json
import pkg_resources
@@ -19,16 +20,19 @@ logger = logging.getLogger(__name__)
def BasePr... |
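A dependency-free sketch of the pretty `list` output requested above — a real CLI would likely reach for a library such as `tabulate`, but the aligned-columns idea is simple (column names here are illustrative):

```python
def format_table(rows, columns):
    """Render a list of record dicts as an aligned plain-text table."""
    widths = {
        c: max([len(c)] + [len(str(r.get(c, ""))) for r in rows])
        for c in columns
    }
    header = " ".join(c.ljust(widths[c]) for c in columns)
    ruler = " ".join("-" * widths[c] for c in columns)
    body = [
        " ".join(str(r.get(c, "")).ljust(widths[c]) for c in columns)
        for r in rows
    ]
    return "\n".join([header, ruler] + body)
```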
AnalogJ__lexicon-336 | [
{
"changes": {
"added_entities": [
"lexicon/cli.py:generate_list_table_result",
"lexicon/cli.py:generate_table_results"
],
"added_modules": [
"lexicon/cli.py:generate_list_table_result",
"lexicon/cli.py:generate_table_results"
],
"edited_entities": [... | AnalogJ/lexicon | 27106bded0bfa8d44ffe3f449ca2e4871588be0f | Memset provider: TypeError: string indices must be integers
Hi,
When using the Memset provider with the default table formatting I get this error:
```bash
$ lexicon memset create example.com TXT --name _acme-challenge.example.com --content BLAH --ttl 300
Traceback (most recent call last):
File "/usr/local/bi... | diff --git a/lexicon/cli.py b/lexicon/cli.py
index dbef1ae2..0b5425ce 100644
--- a/lexicon/cli.py
+++ b/lexicon/cli.py
@@ -14,12 +14,10 @@ from lexicon.parser import generate_cli_main_parser
logger = logging.getLogger(__name__) # pylint: disable=C0103
-def generate_table_result(lexicon_logger, output=None, withou... |
AngryMaciek__angry-moran-simulator-24 | [
{
"changes": {
"added_entities": [
"moranpycess/MoranProcess.py:MoranProcess.PlotSize",
"moranpycess/MoranProcess.py:MoranProcess.PlotAvgBirthPayoff",
"moranpycess/MoranProcess.py:MoranProcess.PlotAvgDeathPayoff",
"moranpycess/MoranProcess.py:MoranProcess.PlotBirthFitness",... | AngryMaciek/angry-moran-simulator | a065091015628bd568f9168b3abf3d8c84167be7 | Python modularisation
double-check the modularisation setup in the `init`. | diff --git a/.github/workflows/lint.yml b/.github/workflows/lint.yml
index fbc60d7..db7d90e 100644
--- a/.github/workflows/lint.yml
+++ b/.github/workflows/lint.yml
@@ -51,7 +51,7 @@ jobs:
run: |
flake8 --max-line-length=88 --ignore F401 moranpycess/__init__.py
flake8 --max-line-length=88... |
ApptuitAI__apptuit-py-10 | [
{
"changes": {
"added_entities": [
"apptuit/apptuit_client.py:DataPoint.value"
],
"added_modules": null,
"edited_entities": null,
"edited_modules": [
"apptuit/apptuit_client.py:DataPoint"
]
},
"file": "apptuit/apptuit_client.py"
}
] | ApptuitAI/apptuit-py | 65d256693243562917c4dfd0e8a753781b153b36 | DataPoint should validate parameters
Right now DataPoint does not validate if the "value" parameter is int/long or float. Eventual API call fails if the value is a string (even representation of int/float).
DataPoint should perform client side validation of all input parameters (metricname, tags, values) without wa... | diff --git a/apptuit/apptuit_client.py b/apptuit/apptuit_client.py
index afa6792..2049aa9 100644
--- a/apptuit/apptuit_client.py
+++ b/apptuit/apptuit_client.py
@@ -286,6 +286,23 @@ class DataPoint(object):
raise ValueError("Tag value %s contains an invalid character, allowed characters are a-z, A-Z, 0... |
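The client-side check the issue asks for — reject non-numeric values before any API call is made — could look like this (a sketch, not Apptuit's actual implementation):

```python
def validate_datapoint_value(value):
    """Return the value as a number, or raise ValueError before any API call."""
    if isinstance(value, bool):
        # bool is a subclass of int; reject it explicitly.
        raise ValueError("boolean is not a valid data point value")
    if isinstance(value, (int, float)):
        return value
    if isinstance(value, str):
        # Accept string representations of ints/floats, per the issue text.
        try:
            return float(value)
        except ValueError:
            raise ValueError("value %r is not numeric" % (value,))
    raise ValueError("value must be int, float, or a numeric string")
```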