$r = get-sensor -id $id[1..890]
'Get-Sensor' timed out: The underlying connection was closed: The connection was closed unexpectedly. Retries remaining: 1
The underlying connection was closed: The connection was closed unexpectedly.
At line:1 char:6
+ $r = get-sensor -id $id[1..890]
+      ~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-Sensor], WebException
    + FullyQualifiedErrorId : System.Net.WebException,PrtgAPI.PowerShell.Cmdlets.GetSensor
I suspect this is because the query is performed as a GET, and the GET URL is hitting the maximum length limit. Submitting it as a POST with the parameters in the request body might resolve this. Alternatively, a -BatchSize parameter or something similar could break the request into multiple queries of that size (a client-side version of this is sketched below).
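In the meantime, the batching can be done on the client side. Below is a minimal sketch, assuming $id is already an array of sensor IDs; the chunk size of 100 is an arbitrary value chosen to keep each URL comfortably under typical limits:

    # Split the ID array into fixed-size chunks and issue one Get-Sensor request per chunk.
    $batchSize = 100
    $r = for ($i = 0; $i -lt $id.Count; $i += $batchSize) {
        $end = [Math]::Min($i + $batchSize - 1, $id.Count - 1)
        Get-Sensor -Id $id[$i..$end]   # each request stays well under the URL length limit
    }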
You are exactly right; this is the issue. I am already aware of the limitations of executing large requests, and have already taken steps to work around them when executing certain known requests, such as specifying a large number of objects to Pause-Object, or specifying a large number of values for a single parameter (such as the services of a new WMI Service object). However, the correct solution is in fact to utilize POST instead of GET.
However, upon attempting to implement this recently as part of #59, I discovered that most API requests do not in fact work properly when using POST. If I recall correctly, this is because we authenticate via a manually specified username and passhash rather than a stored cookie. Hacks that split the request so that the username and passhash are specified in the URL while everything else is specified in the body (along the lines of the sketch below) did not go over well. I don't believe we want to use a cookie everywhere, as we would then have to deal with issues like what to do when it expires, how to renew it, and so on.
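For illustration only, such a split request against PRTG's table API would look something like the following. This is a hypothetical reconstruction, not working code: $server, $user and $passhash are assumed to be predefined variables, and as noted above, PRTG did not handle this hybrid scheme reliably:

    # Hypothetical split request: auth stays in the URL, remaining parameters move to the POST body.
    $auth = "username=$user&passhash=$passhash"
    $filters = ($id | ForEach-Object { "filter_objid=$_" }) -join "&"
    $body = "content=sensors&columns=objid,name&$filters"
    Invoke-WebRequest -Uri "https://$server/api/table.xml?$auth" -Method Post -Body $body -ContentType "application/x-www-form-urlencoded"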
What really needs to be done, in lieu of proper support for extremely large queries, is to catch the fact that the maximum allowed length has been exceeded and display a more descriptive error message stating as much (see the sketch below). This is not something a nice parameter can work around, as you can max out the URL length using a wide variety of values in all sorts of scenarios that can't necessarily be coherently split into multiple requests.
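A minimal sketch of what such a check might look like, assuming the final request URL is available as $url before it is sent; the 2048-character threshold is purely illustrative, as real limits depend on the server and any intermediate proxies:

    # Pre-flight check: fail early with a descriptive message instead of a cryptic WebException.
    $maxUrlLength = 2048
    if ($url.Length -gt $maxUrlLength) {
        throw "Request URL is $($url.Length) characters, exceeding the maximum allowed length of $maxUrlLength. Please split the request into several smaller requests."
    }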