Normalize implicit concatenated f-string quotes per part #13539

Merged Oct 8, 2024 · 4 commits

Conversation

@MichaReiser (Member) commented Sep 27, 2024

This PR fixes a bug where Ruff decided whether the quotes of an implicit concatenated string could be changed for the expression as a whole, instead of deciding per part.

Input

_ = (
    'This string should change its quotes to double quotes'
    f'This string uses double quotes in an expression {"woah"}'
    f'This f-string does not use any quotes.'
)

Stable

_ = (
    'This string should change its quotes to double quotes'
    f'This string uses double quotes in an expression {"woah"}'
    f'This f-string does not use any quotes.'
)

Notice how the formatter leaves every part's quotes unchanged just because a single f-string contains an expression with double quotes.
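For context, the middle f-string genuinely cannot flip its quotes when targeting older interpreters: reusing the outer quote character inside a replacement field is a syntax error before Python 3.12 (PEP 701). A quick illustrative check — the `parses` helper is ours, not Ruff's:

```python
def parses(src: str) -> bool:
    """Return True if `src` compiles as a Python expression."""
    try:
        compile(src, "<example>", "eval")
        return True
    except SyntaxError:
        return False


# Opposite quote characters inside the replacement field: always valid.
assert parses('f\'uses double quotes in an expression {"woah"}\'')

# Reusing the same quote character inside the replacement field only
# parses on Python 3.12+ (PEP 701); older interpreters reject it, so a
# formatter supporting them must leave this part's quotes alone.
print(parses('f"uses double quotes in an expression {"woah"}"'))
```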

Preview

_ = (
    "This string should change its quotes to double quotes"
    f'This string uses double quotes in an expression {"woah"}'
    f"This f-string does not use any quotes."
)

This matches Black and how Ruff normalizes quotes for regular string literals: the decision is made per literal rather than for the entire implicit concatenated string expression.
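The per-part rule can be sketched as a tiny standalone function. The names and the tuple representation (literal text plus the source of each embedded expression) are illustrative, not Ruff's internal data model:

```python
def pick_quote(text: str, expressions: list[str]) -> str:
    """Choose the quote character for ONE part of an implicit concatenation.

    Flipping to double quotes is skipped only if this part itself needs it:
    either the literal text contains a double quote (flipping would force
    escapes), or -- when supporting Python < 3.12 -- an embedded f-string
    expression uses double quotes.
    """
    if '"' in text:
        return "'"
    if any('"' in expr for expr in expressions):
        return "'"
    return '"'


def normalize_parts(parts: list[tuple[str, list[str]]]) -> list[str]:
    # Decide independently for every part, as this PR does.
    return [
        f"{q}{text}{q}"
        for text, exprs in parts
        for q in (pick_quote(text, exprs),)
    ]


print(normalize_parts([
    ("This string should change its quotes to double quotes", []),
    ("This string uses double quotes in an expression {...}", ['"woah"']),
    ("This f-string does not use any quotes.", []),
]))
```

The first and third parts flip to double quotes while the middle part keeps its single quotes, mirroring the preview output above.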

@MichaReiser MichaReiser added the internal An internal refactor or improvement label Sep 27, 2024
github-actions bot (Contributor) commented Sep 27, 2024

ruff-ecosystem results

Formatter (stable)

✅ ecosystem check detected no format changes.

Formatter (preview)

ℹ️ ecosystem check detected format changes. (+93 -93 lines in 36 files in 3 projects; 51 projects unchanged)

langchain-ai/langchain (+3 -3 lines across 1 file)

ruff format --preview

templates/neo4j-semantic-ollama/neo4j_semantic_ollama/recommendation_tool.py~L77

         response = graph.query(recommendation_query_db_history, params)
         try:
             return (
-                'Recommended movies are: '
+                "Recommended movies are: "
                 f'{f"###Movie {nl}".join([el["movie"] for el in response])}'
             )
         except Exception:

templates/neo4j-semantic-ollama/neo4j_semantic_ollama/recommendation_tool.py~L87

         response = graph.query(recommendation_query_genre, params)
         try:
             return (
-                'Recommended movies are: '
+                "Recommended movies are: "
                 f'{f"###Movie {nl}".join([el["movie"] for el in response])}'
             )
         except Exception:

templates/neo4j-semantic-ollama/neo4j_semantic_ollama/recommendation_tool.py~L101

     response = graph.query(query, params)
     try:
         return (
-            'Recommended movies are: '
+            "Recommended movies are: "
             f'{f"###Movie {nl}".join([el["movie"] for el in response])}'
         )
     except Exception:

pandas-dev/pandas (+1 -1 lines across 1 file)

ruff format --preview

scripts/validate_docstrings.py~L369

         for err_code in actual_failures - expected_failures:
             sys.stdout.write(
                 f'{prefix}{res["file"]}:{res["file_line"]}:'
-                f'{err_code}:{func_name}:{error_messages[err_code]}\n'
+                f"{err_code}:{func_name}:{error_messages[err_code]}\n"
             )
             exit_status += 1
         for err_code in ignore_errors.get(func_name, set()) - actual_failures:

rotki/rotki (+89 -89 lines across 34 files)

ruff format --preview

rotkehlchen/api/rest.py~L445

             return
 
         log.error(
-            f'{task_str} dies with exception: {greenlet.exception}.\n'
-            f'Exception Name: {greenlet.exc_info[0]}\n'
-            f'Exception Info: {greenlet.exc_info[1]}\n'
+            f"{task_str} dies with exception: {greenlet.exception}.\n"
+            f"Exception Name: {greenlet.exc_info[0]}\n"
+            f"Exception Info: {greenlet.exc_info[1]}\n"
             f'Traceback:\n {"".join(traceback.format_tb(greenlet.exc_info[2]))}',
         )
         # also write an error for the task result if it's not the main greenlet

rotkehlchen/api/rest.py~L1479

         if identifiers is not None:
             return api_response(
                 result=wrap_in_fail_result(
-                    f'Failed to add {asset.asset_type!s} {asset.name} '
+                    f"Failed to add {asset.asset_type!s} {asset.name} "
                     f'since it already exists. Existing ids: {",".join(identifiers)}'
                 ),
                 status_code=HTTPStatus.CONFLICT,

rotkehlchen/api/rest.py~L3231

     ) -> dict[str, Any]:
         """Return the current price of the assets in the target asset currency."""
         log.debug(
-            f'Querying the current {target_asset.identifier} price of these assets: '
+            f"Querying the current {target_asset.identifier} price of these assets: "
             f'{", ".join([asset.identifier for asset in assets])}',
         )
         # Type is list instead of tuple here because you can't serialize a tuple

rotkehlchen/api/rest.py~L3338

         asset currency.
         """
         log.debug(
-            f'Querying the historical {target_asset.identifier} price of these assets: '
+            f"Querying the historical {target_asset.identifier} price of these assets: "
             f'{", ".join(f"{asset.identifier} at {ts}" for asset, ts in assets_timestamp)}',
             assets_timestamp=assets_timestamp,
         )

rotkehlchen/api/v1/fields.py~L532

 
         if self.limit_to is not None and chain_id not in self.limit_to:
             raise ValidationError(
-                f'Given chain_id {value} is not one of '
+                f"Given chain_id {value} is not one of "
                 f'{",".join([str(x) for x in self.limit_to])} as needed by the endpoint',
             )
 

rotkehlchen/api/v1/fields.py~L572

 
         if self.limit_to is not None and chain not in self.limit_to:
             raise ValidationError(
-                f'Given chain {value} is not one of '
+                f"Given chain {value} is not one of "
                 f'{",".join([str(x) for x in self.limit_to])} as needed by the endpoint',
             )
 

rotkehlchen/api/v1/fields.py~L807

 
         if self.limit_to is not None and location not in self.limit_to:
             raise ValidationError(
-                f'Given location {value} is not one of '
+                f"Given location {value} is not one of "
                 f'{",".join([str(x) for x in self.limit_to])} as needed by the endpoint',
             )
 

rotkehlchen/api/v1/fields.py~L929

                 and not any(value.filename.endswith(x) for x in self.allowed_extensions)
             ):  # noqa: E501
                 raise ValidationError(
-                    f'Given file {value.filename} does not end in any of '
+                    f"Given file {value.filename} does not end in any of "
                     f'{",".join(self.allowed_extensions)}',
                 )
 

rotkehlchen/api/v1/fields.py~L949

             path.suffix == x for x in self.allowed_extensions
         ):  # noqa: E501
             raise ValidationError(
-                f'Given file {path} does not end in any of '
+                f"Given file {path} does not end in any of "
                 f'{",".join(self.allowed_extensions)}',
             )
 

rotkehlchen/api/v1/schemas.py~L466

             data["order_by_attributes"]
         ).issubset(valid_ordering_attr):
             error_msg = (
-                f'order_by_attributes for trades can not be '
+                f"order_by_attributes for trades can not be "
                 f'{",".join(set(data["order_by_attributes"]) - valid_ordering_attr)}'
             )
             raise ValidationError(

rotkehlchen/api/v1/schemas.py~L695

             data["order_by_attributes"]
         ).issubset(valid_ordering_attr):
             error_msg = (
-                f'order_by_attributes for history event data can not be '
+                f"order_by_attributes for history event data can not be "
                 f'{",".join(set(data["order_by_attributes"]) - valid_ordering_attr)}'
             )
             raise ValidationError(

rotkehlchen/api/v1/schemas.py~L985

             data["order_by_attributes"]
         ).issubset(valid_ordering_attr):
             error_msg = (
-                f'order_by_attributes for asset movements can not be '
+                f"order_by_attributes for asset movements can not be "
                 f'{",".join(set(data["order_by_attributes"]) - valid_ordering_attr)}'
             )
             raise ValidationError(

rotkehlchen/api/v1/schemas.py~L1179

         raise ValidationError(
             f'Invalid current price oracles in: {", ".join(oracle_names)}. '
             f'Supported oracles are: {", ".join(supported_oracle_names)}. '
-            f'Check there are no repeated ones.',
+            f"Check there are no repeated ones.",
         )
 
 

rotkehlchen/api/v1/schemas.py~L1195

         raise ValidationError(
             f'Invalid historical price oracles in: {", ".join(oracle_names)}. '
             f'Supported oracles are: {", ".join(supported_oracle_names)}. '
-            f'Check there are no repeated ones.',
+            f"Check there are no repeated ones.",
         )
 
 

rotkehlchen/api/v1/schemas.py~L1685

             data["order_by_attributes"]
         ).issubset(valid_ordering_attr):
             error_msg = (
-                f'order_by_attributes for accounting report data can not be '
+                f"order_by_attributes for accounting report data can not be "
                 f'{",".join(set(data["order_by_attributes"]) - valid_ordering_attr)}'
             )
             raise ValidationError(

rotkehlchen/api/v1/schemas.py~L2758

             data["order_by_attributes"]
         ).issubset(valid_ordering_attr):
             error_msg = (
-                f'order_by_attributes for eth2 daily stats can not be '
+                f"order_by_attributes for eth2 daily stats can not be "
                 f'{",".join(set(data["order_by_attributes"]) - valid_ordering_attr)}'
             )
             raise ValidationError(

rotkehlchen/api/v1/schemas.py~L3081

         ):  # noqa: E501
             raise ValidationError(
                 f'timestamp provided {data["timestamp"]} is not the same as the '
-                f'one for the entries provided.',
+                f"one for the entries provided.",
             )
 
 

rotkehlchen/chain/aggregator.py~L780

         unknown_accounts = set(accounts).difference(self.accounts.get(blockchain))
         if len(unknown_accounts) != 0:
             raise InputError(
-                f'Tried to remove unknown {blockchain.value} '
+                f"Tried to remove unknown {blockchain.value} "
                 f'accounts {",".join(unknown_accounts)}',
             )
 

rotkehlchen/chain/ethereum/abi.py~L89

     duplicate_names = set(log_topic_names).intersection(log_data_names)
     if duplicate_names:
         raise DeserializationError(
-            f'The following argument names are duplicated '
+            f"The following argument names are duplicated "
             f"between event inputs: '{', '.join(duplicate_names)}'",
         )
 

rotkehlchen/chain/ethereum/modules/nft/nfts.py~L293

         with self.db.user_write() as write_cursor:
             # Remove NFTs that the user no longer owns from the DB cache
             write_cursor.execute(
-                f'DELETE FROM nfts WHERE owner_address IN '
+                f"DELETE FROM nfts WHERE owner_address IN "
                 f'({",".join("?" * len(addresses))}) AND identifier NOT IN '
                 f'({",".join("?" * len(fresh_nfts_identifiers))})',
                 tuple(addresses) + tuple(fresh_nfts_identifiers),

rotkehlchen/chain/evm/decoding/curve/curve_cache.py~L406

             raise RemoteError(f"Curve pool data {api_pool_data} are missing key {e}") from e
         except DeserializationError as e:
             log.error(
-                f'Could not deserialize evm address while decoding curve pool '
+                f"Could not deserialize evm address while decoding curve pool "
                 f'{api_pool_data["address"]} information from curve api: {e}',
             )
 

rotkehlchen/chain/substrate/manager.py~L110

             return result
 
         raise RemoteError(
-            f'{manager.chain} request failed after trying the following nodes: '
+            f"{manager.chain} request failed after trying the following nodes: "
             f'{", ".join(requested_nodes)}',
         )
 

rotkehlchen/db/dbhandler.py~L1242

         """
         # Assure all are there
         accounts_number = write_cursor.execute(
-            f'SELECT COUNT(*) from blockchain_accounts WHERE blockchain = ? '
+            f"SELECT COUNT(*) from blockchain_accounts WHERE blockchain = ? "
             f'AND account IN ({",".join("?" * len(accounts))})',
             (blockchain.value, *accounts),
         ).fetchone()[0]

rotkehlchen/db/dbhandler.py~L2313

         ]
         for hashes_chunk in get_chunks(hashes_to_remove, n=1000):  # limit num of hashes in a query
             write_cursor.execute(  # delete transactions themselves
-                f'DELETE FROM zksynclite_transactions WHERE tx_hash IN '
+                f"DELETE FROM zksynclite_transactions WHERE tx_hash IN "
                 f'({",".join("?" * len(hashes_chunk))})',
                 hashes_chunk,
             )
             write_cursor.execute(
-                f'DELETE FROM history_events WHERE identifier IN (SELECT H.identifier '
-                f'FROM history_events H INNER JOIN evm_events_info E '
-                f'ON H.identifier=E.identifier AND E.tx_hash IN '
+                f"DELETE FROM history_events WHERE identifier IN (SELECT H.identifier "
+                f"FROM history_events H INNER JOIN evm_events_info E "
+                f"ON H.identifier=E.identifier AND E.tx_hash IN "
                 f'({", ".join(["?"] * len(hashes_chunk))}) AND H.location=?)',
                 hashes_chunk + [Location.ZKSYNC_LITE.serialize_for_db()],
             )

rotkehlchen/db/dbhandler.py~L3152

 
         if len(unknown_tags) != 0:
             raise TagConstraintError(
-                f'When {action} {data_type}, unknown tags '
+                f"When {action} {data_type}, unknown tags "
                 f'{", ".join(unknown_tags)} were found',
             )
 

rotkehlchen/db/dbhandler.py~L3806

         result = {}
         with self.conn.read_ctx() as cursor:
             cursor.execute(
-                f'SELECT identifier, name, collection_name, image_url FROM nfts WHERE '
+                f"SELECT identifier, name, collection_name, image_url FROM nfts WHERE "
                 f'identifier IN ({",".join("?" * len(identifiers))})',
                 identifiers,
             )

rotkehlchen/db/eth2.py~L221

         """
         questionmarks = "?" * len(addresses)
         cursor.execute(
-            f'SELECT S.validator_index FROM eth_staking_events_info S LEFT JOIN '
-            f'history_events H on S.identifier=H.identifier WHERE H.location_label IN '
+            f"SELECT S.validator_index FROM eth_staking_events_info S LEFT JOIN "
+            f"history_events H on S.identifier=H.identifier WHERE H.location_label IN "
             f'({",".join(questionmarks)})',
             addresses,
         )

rotkehlchen/db/eth2.py~L353

         with self.db.user_write() as cursor:
             # Delete from the validators table. This should also delete from daily_staking_details
             cursor.execute(
-                f'DELETE FROM eth2_validators WHERE validator_index IN '
+                f"DELETE FROM eth2_validators WHERE validator_index IN "
                 f'({",".join(question_marks)})',
                 validator_indices,
             )

rotkehlchen/db/eth2.py~L366

             # Delete from the events table, all staking events except for deposits.
             # We keep deposits since they are associated with the address and are EVM transactions
             cursor.execute(
-                f'DELETE FROM history_events WHERE identifier in (SELECT S.identifier '
-                f'FROM eth_staking_events_info S WHERE S.validator_index IN '
+                f"DELETE FROM history_events WHERE identifier in (SELECT S.identifier "
+                f"FROM eth_staking_events_info S WHERE S.validator_index IN "
                 f'({",".join(question_marks)})) AND entry_type != ?',
                 (*validator_indices, HistoryBaseEntryType.ETH_DEPOSIT_EVENT.serialize_for_db()),
             )

rotkehlchen/db/unresolved_conflicts.py~L111

 
             # also query the linked rules information
             cursor.execute(
-                'SELECT accounting_rule, property_name, setting_name FROM '
+                "SELECT accounting_rule, property_name, setting_name FROM "
                 f'linked_rules_properties WHERE accounting_rule IN ({",".join("?" * len(id_to_data))})',  # noqa: E501
                 list(id_to_data.keys()),
             )

rotkehlchen/db/updates.py~L277

             if single_contract_data["abi"] not in remote_id_to_local_id:
                 self.msg_aggregator.add_error(
                     f'ABI with id {single_contract_data["abi"]} was missing in a contracts '
-                    f'update. Please report it to the rotki team.',
+                    f"update. Please report it to the rotki team.",
                 )
                 continue
             new_contracts_data.append((

rotkehlchen/db/updates.py~L310

                 self.msg_aggregator.add_error(
                     f'Could not deserialize address {raw_entry["address"]} or blockchain '
                     f'{raw_entry["blockchain"]} that was seen in a global addressbook update. '
-                    f'Please report it to the rotki team. {e!s}',
+                    f"Please report it to the rotki team. {e!s}",
                 )
                 continue
 

rotkehlchen/exchanges/binance.py~L141

     rate = deserialize_price(binance_trade["price"])
     if binance_trade["symbol"] not in binance_symbols_to_pair:
         raise DeserializationError(
-            f'Error reading a {location!s} trade. Could not find '
+            f"Error reading a {location!s} trade. Could not find "
             f'{binance_trade["symbol"]} in binance_symbols_to_pair',
         )
 

rotkehlchen/exchanges/binance.py~L524

             except DeserializationError:
                 log.error(
                     f'Found {self.name} asset with non-string type {type(entry["asset"])}. '
-                    f'Ignoring its balance query.',
+                    f"Ignoring its balance query.",
                 )
                 continue
 

rotkehlchen/exchanges/binance.py~L959

                     continue
                 except DeserializationError:
                     log.error(
-                        f'Found {self.name} asset with non-string type '
+                        f"Found {self.name} asset with non-string type "
                         f'{type(entry["asset"])}. Ignoring its futures balance query.',
                     )
                     continue

rotkehlchen/exchanges/binance.py~L1039

                     continue
                 except DeserializationError:
                     log.error(
-                        f'Found {self.name} asset with non-string type '
+                        f"Found {self.name} asset with non-string type "
                         f'{type(entry["asset"])}. Ignoring its margined futures balance query.',
                     )
                     continue

rotkehlchen/exchanges/bitfinex.py~L395

 
             if raw_result[timestamp_index] > options["end"]:
                 log.debug(
-                    f'Unexpected result requesting {self.name} {case}. '
-                    f'Result timestamp {raw_result[timestamp_index]} is greater than '
+                    f"Unexpected result requesting {self.name} {case}. "
+                    f"Result timestamp {raw_result[timestamp_index]} is greater than "
                     f'end filter {options["end"]}. Stop requesting.',
                     raw_result=raw_result,
                 )

rotkehlchen/exchanges/bitstamp.py~L474

         payload_string = urlencode(call_options)
         content_type = "" if payload_string == "" else "application/x-www-form-urlencoded"
         message = (
-            'BITSTAMP '
-            f'{self.api_key}'
-            f'{method.upper()}'
+            "BITSTAMP "
+            f"{self.api_key}"
+            f"{method.upper()}"
             f'{request_url.replace("https://", "")}'
-            f'{query_params}'
-            f'{content_type}'
-            f'{nonce}'
-            f'{timestamp}'
-            'v2'
-            f'{payload_string}'
+            f"{query_params}"
+            f"{content_type}"
+            f"{nonce}"
+            f"{timestamp}"
+            "v2"
+            f"{payload_string}"
         )
         signature = hmac.new(
             self.secret,

rotkehlchen/exchanges/coinbase.py~L355

 
             if not isinstance(account_data["id"], str):
                 log.error(
-                    f'Found coinbase account entry with a non string id: '
+                    f"Found coinbase account entry with a non string id: "
                     f'{account_data["id"]}. Skipping it. ',
                 )
                 continue

rotkehlchen/exchanges/coinbase.py~L933

                     if asset != asset_from_coinbase(fee_asset, time=timestamp):
                         # If not we set ZERO fee and ignore
                         log.error(
-                            f'In a coinbase withdrawal of {asset.identifier} the fee'
+                            f"In a coinbase withdrawal of {asset.identifier} the fee"
                             f'is denoted in {raw_fee["currency"]}',
                         )
                     else:

rotkehlchen/exchanges/kraken.py~L503

 
             if count != response["count"]:
                 log.error(
-                    f'Kraken unexpected response while querying endpoint for period. '
+                    f"Kraken unexpected response while querying endpoint for period. "
                     f'Original count was {count} and response returned {response["count"]}',
                 )
                 with_errors = True

rotkehlchen/externalapis/coingecko.py~L840

                 if entry["id"] in data:
                     log.warning(
                         f'Found duplicate coingecko identifier {entry["id"]} when querying '
-                        f'the list of coingecko assets. Ignoring...',
+                        f"the list of coingecko assets. Ignoring...",
                     )
                     continue
 

rotkehlchen/externalapis/cryptocompare.py~L283

         can_query = got_cached_data or not rate_limited
         log.debug(
             f'{"Will" if can_query else "Will not"} query '
-            f'Cryptocompare history for {from_asset.identifier} -> '
-            f'{to_asset.identifier} @ {timestamp}. Cached data: {got_cached_data}'
-            f' rate_limited in last {seconds} seconds: {rate_limited}',
+            f"Cryptocompare history for {from_asset.identifier} -> "
+            f"{to_asset.identifier} @ {timestamp}. Cached data: {got_cached_data}"
+            f" rate_limited in last {seconds} seconds: {rate_limited}",
         )
         return can_query
 

rotkehlchen/externalapis/etherscan.py~L557

                 except DeserializationError as e:
                     log.error(
                         f"Failed to read transaction timestamp {entry['hash']} from {self.chain} "
-                        f'etherscan for {account} in the range {from_ts} to {to_ts}. {e!s}',
+                        f"etherscan for {account} in the range {from_ts} to {to_ts}. {e!s}",
                     )
                     continue
 

rotkehlchen/externalapis/etherscan.py~L570

                 except DeserializationError as e:
                     log.error(
                         f"Failed to read transaction hash {entry['hash']} from {self.chain} "
-                        f'etherscan for {account} in the range {from_ts} to {to_ts}. {e!s}',
+                        f"etherscan for {account} in the range {from_ts} to {to_ts}. {e!s}",
                     )
                     continue
 

rotkehlchen/externalapis/gnosispay.py~L225

         """Return the gnosis pay data matching the given DB data"""
         with self.database.conn.read_ctx() as cursor:
             cursor.execute(
-                f'SELECT tx_hash, timestamp, merchant_name, merchant_city, country, mcc, '
-                f'transaction_symbol, transaction_amount, billing_symbol, billing_amount, '
-                f'reversal_symbol, reversal_amount, reversal_tx_hash '
+                f"SELECT tx_hash, timestamp, merchant_name, merchant_city, country, mcc, "
+                f"transaction_symbol, transaction_amount, billing_symbol, billing_amount, "
+                f"reversal_symbol, reversal_amount, reversal_tx_hash "
                 f'{", identifier " if with_identifier else ""}'
-                f'FROM gnosispay_data WHERE {wherestatement}',
+                f"FROM gnosispay_data WHERE {wherestatement}",
                 bindings,
             )
             if (result := cursor.fetchone()) is None:

rotkehlchen/externalapis/opensea.py~L295

                 else:  # should not happen. That means collections endpoint doesnt return anything
                     raise DeserializationError(
                         f'Could not find collection {entry["collection"]} in opensea collections '
-                        f'endpoint',
+                        f"endpoint",
                     )
 
             last_price_in_usd = last_price_in_eth * eth_usd_price

rotkehlchen/globaldb/assets_management.py~L40

 
     if int(data["version"]) not in ASSETS_FILE_IMPORT_ACCEPTED_GLOBALDB_VERSIONS:
         raise InputError(
-            f'Provided file is for a different version of rotki. GlobalDB File version: '
+            f"Provided file is for a different version of rotki. GlobalDB File version: "
             f'{data["version"]} Accepted GlobalDB version by rotki: {ASSETS_FILE_IMPORT_ACCEPTED_GLOBALDB_VERSIONS}',  # noqa: E501
         )
     if data["assets"] is None:

rotkehlchen/globaldb/handler.py~L445

             # results before hand, in the range from 10 to 100 and this guarantees that the size of
             # the query is small.
             underlying_tokens_query = (
-                f'SELECT parent_token_entry, address, token_kind, weight FROM '
-                f'underlying_tokens_list LEFT JOIN evm_tokens ON '
-                f'underlying_tokens_list.identifier=evm_tokens.identifier '
+                f"SELECT parent_token_entry, address, token_kind, weight FROM "
+                f"underlying_tokens_list LEFT JOIN evm_tokens ON "
+                f"underlying_tokens_list.identifier=evm_tokens.identifier "
                 f'WHERE parent_token_entry IN ({",".join(["?"] * len(assets_info))})'
             )
             # populate all underlying tokens

rotkehlchen/globaldb/handler.py~L1210

                 )
         except sqlite3.IntegrityError as e:
             log.error(
-                f'One of the following asset ids caused a DB IntegrityError ({e!s}): '
+                f"One of the following asset ids caused a DB IntegrityError ({e!s}): "
                 f'{",".join([x.identifier for x in assets])}',
             )  # should not ever happen but need to handle with informative log if it does
 

rotkehlchen/globaldb/upgrades/v2_v3.py~L356

         "KSM",
     )
     cursor.execute(
-        f'SELECT from_asset, to_asset, source_type, timestamp, price FROM '
+        f"SELECT from_asset, to_asset, source_type, timestamp, price FROM "
         f'price_history WHERE (source_type=="A" OR from_asset IN ({",".join(["?"] * len(assets))}))',  # noqa: E501
         assets,
     )

rotkehlchen/greenlets/manager.py~L76

             return
 
         msg = (
-            f'{first_line}.\n'
-            f'Exception Name: {greenlet.exc_info[0]}\nException Info: {greenlet.exc_info[1]}'
+            f"{first_line}.\n"
+            f"Exception Name: {greenlet.exc_info[0]}\nException Info: {greenlet.exc_info[1]}"
             f'\nTraceback:\n {"".join(traceback.format_tb(greenlet.exc_info[2]))}'
         )
         log.error(msg)

rotkehlchen/history/manager.py~L268

 
             if len(exchange_names) != 0:
                 self.msg_aggregator.add_error(
-                    f'Failed to query some events from {location.name} exchanges '
+                    f"Failed to query some events from {location.name} exchanges "
                     f'{",".join(exchange_names)}',
                 )
 

rotkehlchen/serialization/schemas.py~L230

             if weight_sum > ONE:
                 raise ValidationError(
                     f'The sum of underlying token weights for {data["address"]} '
-                    f'is {weight_sum * 100} and exceeds 100%',
+                    f"is {weight_sum * 100} and exceeds 100%",
                 )
             if weight_sum < ONE:
                 raise ValidationError(
                     f'The sum of underlying token weights for {data["address"]} '
-                    f'is {weight_sum * 100} and does not add up to 100%',
+                    f"is {weight_sum * 100} and does not add up to 100%",
                 )
 
     @post_load

rotkehlchen/tasks/manager.py~L1100

         current_greenlets = len(self.greenlet_manager.greenlets) + len(self.api_task_greenlets)
         not_proceed = current_greenlets >= self.max_tasks_num
         log.debug(
-            f'At task scheduling. Current greenlets: {current_greenlets} '
-            f'Max greenlets: {self.max_tasks_num}. '
+            f"At task scheduling. Current greenlets: {current_greenlets} "
+            f"Max greenlets: {self.max_tasks_num}. "
             f'{"Will not schedule" if not_proceed else "Will schedule"}.',
         )
         if not_proceed:

rotkehlchen/tests/exchanges/test_coinbase.py~L1193

             test_warnings.warn(
                 UserWarning(
                     f'Found unknown asset {e.identifier} with symbol {coin["id"]} in Coinbase. '
-                    f'Support for it has to be added',
+                    f"Support for it has to be added",
                 )
             )

rotkehlchen/tests/unit/globaldb/test_globaldb_consistency.py~L148

             identifiers[0]
             for cursor in (old_db_cursor, packaged_db_cursor)
             for identifiers in cursor.execute(
-                f'SELECT identifier FROM evm_tokens WHERE '
+                f"SELECT identifier FROM evm_tokens WHERE "
                 f'protocol IN ({",".join(["?"] * len(IGNORED_PROTOCOLS))}) OR '
-                f'address in (SELECT value FROM general_cache)',
+                f"address in (SELECT value FROM general_cache)",
                 list(IGNORED_PROTOCOLS),
             ).fetchall()
         }

rotkehlchen/tests/unit/test_data_updates.py~L730

     ):
         result = cursor.execute(  # additions are not present already
             f"SELECT {'COUNT(*)' if after_upgrade is False else 'local_id'} "
-            'FROM location_asset_mappings WHERE location IS ? AND exchange_symbol IS ?',
+            "FROM location_asset_mappings WHERE location IS ? AND exchange_symbol IS ?",
             (
                 None
                 if addition["location"] is None

rotkehlchen/tests/utils/ethereum.py~L261

         names = [str(x) for idx, x in enumerate(connect_at_start) if not connected[idx]]
         log.warning(
             f'Did not connect to nodes: {",".join(names)} due to '
-            f'timeout of {NODE_CONNECTION_TIMEOUT}. Connected to {connected}',
+            f"timeout of {NODE_CONNECTION_TIMEOUT}. Connected to {connected}",
         )
 
 

rotkehlchen/tests/utils/substrate.py~L98

     not_connected_nodes = set(node_names) - connected_nodes
     if not_connected_nodes:
         log.info(
-            f'Substrate {chain} tests failed to connect to nodes: '
+            f"Substrate {chain} tests failed to connect to nodes: "
             f'{",".join([str(node) for node in not_connected_nodes])} ',
         )
 

rotkehlchen/tests/utils/substrate.py~L124

     except gevent.Timeout:
         not_connected_nodes = all_nodes - connected
         log.info(
-            f'{substrate_manager.chain} manager failed to connect to '
+            f"{substrate_manager.chain} manager failed to connect to "
             f'nodes: {",".join([str(node) for node in not_connected_nodes])} '
-            f'due to timeout of {NODE_CONNECTION_TIMEOUT}',
+            f"due to timeout of {NODE_CONNECTION_TIMEOUT}",
         )

@MichaReiser
Member Author

Uff, I might have to gate this behind preview :( I'm not sure why this isn't local to f-string formatting only.

Black uses the same layout as this PR proposes.

@MichaReiser
Member Author

I believe this is related to #13237 because the formatter always preserves the quotes if the expression contains a quote anywhere (even inside a string). This seems odd to me

@dhruvmanila could you help me understand why this is in place?

@MichaReiser
Member Author

This might actually be a bug in ruff. For example, it fails to normalise the quotes for

"aaaa" f'bbbbb{'c'}b' 

Where black handles this successfully.
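Not part of the PR itself, but a quick sanity check of why normalization here is safe: the quote style of each part never changes the runtime value of an implicit concatenation (the nested-quote example above is rewritten with alternating quotes so it also runs on Python < 3.12, where PEP 701 same-quote nesting is unavailable).

```python
# Implicit concatenation with mixed quote styles yields the same string
# regardless of which quotes the formatter picks for each part.
normalized = "aaaa" f"bbbbb{'c'}b"   # quotes as Black would normalize them
original = 'aaaa' f'bbbbb{"c"}b'     # the other quoting of the same parts

# Both spellings produce the identical runtime value.
assert normalized == original == "aaaabbbbbcb"
print(normalized)
```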

@dhruvmanila
Member

I'm looking into this...

@dhruvmanila
Member

This might actually be a bug in ruff. For example, it fails to normalise the quotes for

Yeah, this seems like a bug and I think it's the same as #13237

@dhruvmanila
Member

@dhruvmanila could you help me understand why this is in place?

Based on the previous comment (#9058 (comment)), I think it was so that the quoting effect remains the same throughout the f-string, but I'm realizing now that it doesn't matter. For example, I'm guessing that my thought process was to keep both the string and the f-string formatted with single quotes ('foo' f'bar {"x"}') even though we can change the string literal's quotes.
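To illustrate the per-part decision with the example from this comment (a sketch of the behavior the PR description shows, not ruff's actual formatting code): the plain literal can switch to double quotes even though the adjacent f-string has to keep single quotes, because its replacement field already contains double quotes.

```python
# Old behavior: one decision for the whole expression, so both parts
# keep single quotes because the f-string contains a double quote.
stable = 'foo' f'bar {"x"}'

# Per-part behavior: the plain literal is normalized independently of
# the f-string next to it.
preview = "foo" f'bar {"x"}'

# The change is purely stylistic; both spellings are the same value.
assert stable == preview == "foobar x"
```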

@MichaReiser MichaReiser force-pushed the micha/remove-string-literal-implicit-fstring-kind branch from a519890 to 2d542bd on October 7, 2024 13:16
@MichaReiser MichaReiser force-pushed the micha/remove-string-literal-implicit-fstring-kind branch from ca44e51 to 0f47c8a on October 7, 2024 13:34
@MichaReiser MichaReiser changed the title refactor: Remove StringLiteralKind::InImplicitConcatenatedFString kind Normalize string literal quotes in implicit concatenated f-strings containing a quote character Oct 7, 2024
@MichaReiser MichaReiser changed the title Normalize string literal quotes in implicit concatenated f-strings containing a quote character Normalize implicit concatenated f-string quotes per part Oct 7, 2024
@@ -19,6 +19,13 @@ pub(crate) fn is_f_string_formatting_enabled(context: &PyFormatContext) -> bool
context.is_preview()
}

/// See [#13539](https://github.com/astral-sh/ruff/pull/13539)
pub(crate) fn is_f_string_implicit_concatenated_string_literal_quotes_enabled(
Member Author

I created a new preview style for now in case we push out the f-string stabilization. We can merge this change with the f-string preview style if we decide to ship both changes.

@MichaReiser MichaReiser marked this pull request as ready for review October 7, 2024 14:38
@MichaReiser MichaReiser added formatter Related to the formatter preview Related to preview mode features and removed internal An internal refactor or improvement labels Oct 7, 2024
Member

@dhruvmanila dhruvmanila left a comment

Looks good, thanks for fixing this

crates/ruff_python_formatter/generate.py
@MichaReiser MichaReiser enabled auto-merge (squash) October 8, 2024 09:52
@MichaReiser MichaReiser merged commit fc661e1 into main Oct 8, 2024
18 checks passed
@MichaReiser MichaReiser deleted the micha/remove-string-literal-implicit-fstring-kind branch October 8, 2024 09:59
@MichaReiser MichaReiser mentioned this pull request Oct 8, 2024