How to get csc.exe path?

[Origin]: https://stackoverflow.com/questions/6660512/how-to-get-csc-exe-path

The best way to find the path to CSC.exe is to run this simple line at the command prompt:

dir /s %WINDIR%\CSC.EXE

dir – lists directory contents

/s – includes all subfolders

%WINDIR%\CSC.EXE – searches the Windows folder for files named CSC.exe.

And this is our result: a list of every CSC.exe found under the Windows folder (the original answer showed a screenshot of the output here).

Then we can simply compile our example code with a line like:

C:\WINDOWS\...\v4.0.30319\CSC.exe HelloWorld.cs
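For reference, here is a minimal HelloWorld.cs that the line above would compile; the file contents are assumed, since the original post does not show them:

// HelloWorld.cs – assumed contents for the compilation example above
using System;

class HelloWorld
{
    static void Main()
    {
        // Print something so we can confirm the compiled HelloWorld.exe runs
        Console.WriteLine("Hello, World!");
    }
}

csc.exe will produce HelloWorld.exe in the current directory; running it should print the greeting.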

Regards.


Percentage Encoding of special characters before sending it in the URL

[Origin]: http://stackoverflow.com/questions/16186042/percentage-encoding-of-special-characters-before-sending-it-in-the-url

I need to pass special characters like # and ! in URLs to Facebook, Twitter and similar social sites. To do that, I am replacing such characters with URL escape codes.

return valToEncode.Replace("!", "%21").Replace("#", "%23")
                  .Replace("$", "%24").Replace("&", "%26")
                  .Replace("'", "%27").Replace("(", "%28")
                  .Replace(")", "%29").Replace("*", "%2A");

It works for me, but I want to do it more efficiently. Is there any other way to escape such characters? I tried Server.URLEncode(), but Facebook doesn’t render it.

Thanks in advance,
Priya


You should use the Uri.EscapeDataString method if you want compatibility with the RFC 3986 standard, where percent-encoding is defined.

For example, a space will always be encoded as %20:

var result = Uri.EscapeDataString("a q");
// result == "a%20q"

while HttpUtility.UrlEncode (which, by the way, is used internally by HttpServerUtility.UrlEncode) encodes a space as a + character:

var result = HttpUtility.UrlEncode("a q");
// result == "a+q"

What’s more, the behavior of Uri.EscapeDataString is compatible with the client-side JavaScript encodeURIComponent method (apart from the case of the hex digits, which RFC 3986 says is not significant).
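For illustration, the whole Replace chain from the question could be collapsed into a single call. The wrapper class and method names below are made up; only Uri.EscapeDataString itself is the real API:

using System;

public static class UrlEncodingHelper
{
    // Hypothetical replacement for the Replace chain in the question.
    // Uri.EscapeDataString percent-encodes spaces and reserved characters
    // such as # and & (see the note on RFC 3986 compatibility above).
    public static string EncodeForUrl(string valToEncode)
    {
        return Uri.EscapeDataString(valToEncode);
    }
}

// Example: UrlEncodingHelper.EncodeForUrl("a q#b") returns "a%20q%23b"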


Interface vs Abstract Class (general OO)

[Origin]: http://stackoverflow.com/questions/761194/interface-vs-abstract-class-general-oo

I have recently had two telephone interviews where I’ve been asked about the differences between an Interface and an Abstract class. I explained every aspect of them I could think of, but it seems they were waiting for me to mention something specific, and I don’t know what it is.

From my experience I think the following is true. If I am missing a major point please let me know.

Interface:

Every single method declared in an Interface has to be implemented in the implementing class. Only Events, Delegates, Properties (C#) and Methods can exist in an Interface. A class can implement multiple Interfaces.

Abstract Class:

Only Abstract methods have to be implemented by the subclass. An Abstract class can have normal methods with implementations. An Abstract class can also have class variables besides Events, Delegates, Properties and Methods. A class can inherit from only one Abstract class, due to the absence of multiple inheritance in C#.

  1. After all that, the interviewer came up with the question “What if you had an Abstract class with only abstract methods? How would that be different from an interface?” I didn’t know the answer, but I think it’s the inheritance, as mentioned above, right?
  2. Another interviewer asked me: what if you had a public variable inside the interface, how would that be different from one in an Abstract class? I insisted you can’t have a public variable inside an interface. I didn’t know what he wanted to hear, but he wasn’t satisfied either.



While your question indicates it’s for “general OO”, it really seems to be focusing on .NET use of these terms.

In .NET (similar for Java):

  • interfaces can have no state or implementation
  • a class that implements an interface must provide an implementation of all the methods of that interface
  • abstract classes may contain state (data members) and/or implementation (methods)
  • abstract classes can be inherited without implementing the abstract methods (though such a derived class is abstract itself)
  • interfaces may be multiple-inherited, abstract classes may not (this is probably the key concrete reason for interfaces to exist separately from abstract classes – they permit an implementation of multiple inheritance that removes many of the problems of general MI); a short sketch of this follows the list.
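A minimal C# sketch of that last bullet, with invented type names: one abstract base class (which may carry state and default implementations) plus as many interfaces as you like.

// Invented types, purely to illustrate single class inheritance
// combined with multiple interface implementation.
public interface IReadable { string Read(); }
public interface IWritable { void Write(string text); }

public abstract class StorageBase
{
    protected int OperationCount;                   // state is allowed here

    public void LogOperation() => OperationCount++; // default implementation

    public abstract string Name { get; }            // must be supplied by the derived class
}

// One base class, two interfaces.
public class FileStorage : StorageBase, IReadable, IWritable
{
    public override string Name => "file";

    public string Read() { LogOperation(); return "contents"; }

    public void Write(string text) { LogOperation(); }
}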

As general OO terms, the differences are not necessarily well-defined. For example, there are C++ programmers who may hold similar rigid definitions (interfaces are a strict subset of abstract classes that cannot contain implementation), while some may say that an abstract class with some default implementations is still an interface or that a non-abstract class can still define an interface.

Indeed, there is a C++ idiom called the Non-Virtual Interface (NVI), in which the public methods are non-virtual methods that ‘thunk’ to private virtual methods.
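The code sample that originally followed that sentence did not survive here; below is a rough C# rendition of the same shape, with invented names. (C# cannot override private methods, so the customization point is protected rather than private, but the public entry point stays non-virtual.)

public abstract class Report
{
    // Public, non-virtual entry point: callers always go through this,
    // so the outer algorithm is fixed by the base class.
    public string Render()
    {
        return "<header>" + RenderBody() + "<footer>";
    }

    // The overridable 'thunked-to' step that derived classes customize.
    protected abstract string RenderBody();
}

public class SalesReport : Report
{
    protected override string RenderBody() => "sales figures";
}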


Comparison : interface methods vs virtual methods vs abstract methods

[Origin]: http://stackoverflow.com/questions/4762930/comparison-interface-methods-vs-virtual-methods-vs-abstract-methods

What are the advantages and disadvantages of each of these?

  • interface methods
  • virtual methods
  • abstract methods

When one should choose what? What are the points one should keep in mind when making this decision?


Virtual and abstract are almost the same. A virtual method has an implementation in the base class that can optionally be overridden, while an abstract method has none and must be overridden in a child class. Otherwise they are the same. Choosing between them depends on the situation: if you have a base implementation, you use virtual; if you don’t, and you need every descendant to implement it for itself, you choose abstract.

Interface methods are implementations of a method that is declared in an interface that the class implements. This is quite unrelated to the other two; I think a method can be both virtual and an interface implementation. The advantage of interfaces is that you declare one interface (duh) that can be implemented by two totally different classes. That way, you can run the same code against two different classes, as long as the methods you’d like to call are declared in an interface they share.
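A compact sketch showing all three side by side; the types and members are invented for illustration, and Area here is both an interface method and an abstract method.

using System;

public interface IShape
{
    double Area();                    // interface method: a contract, no body
}

public abstract class Shape : IShape
{
    public abstract double Area();    // abstract: no body, every descendant must override

    public virtual string Describe()  // virtual: has a default body, overriding is optional
    {
        return "a shape with area " + Area();
    }
}

public class Circle : Shape
{
    private readonly double _radius;

    public Circle(double radius) { _radius = radius; }

    public override double Area() => Math.PI * _radius * _radius;

    public override string Describe() => "a circle, " + base.Describe();
}

Any code written against IShape or Shape works with Circle (or any other descendant) unchanged, which is the “same code on two different classes” point above.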


How to convert byte[] to string?

[Origin]: http://stackoverflow.com/questions/1003275/how-to-convert-byte-to-string

I have a byte[] array that is loaded from a file that I happen to know contains UTF-8. In some debugging code, I need to convert it to a string. Is there a one-liner that will do this?

Under the covers it should be just an allocation and a memcopy, so even if it is not implemented, it should be possible.

string result = System.Text.Encoding.UTF8.GetString(byteArray);
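A small round trip for context; the file path is a placeholder, and the file is assumed to contain UTF-8 text:

// Placeholder path – substitute the file you are actually debugging.
byte[] byteArray = System.IO.File.ReadAllBytes(@"C:\temp\example-utf8.txt");

// Decode the UTF-8 bytes into a .NET string (the one-liner from the answer).
string text = System.Text.Encoding.UTF8.GetString(byteArray);

// The reverse direction uses GetBytes on the same encoding.
byte[] roundTripped = System.Text.Encoding.UTF8.GetBytes(text);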

Using .NET how to convert ISO 8859-1 encoded text files that contain Latin-1 accented characters to UTF-8

[Origin]: http://stackoverflow.com/questions/2595442/using-net-how-to-convert-iso-8859-1-encoded-text-files-that-contain-latin-1-acc

I am being sent text files saved in ISO 8859-1 format that contain accented characters from the Latin-1 range (as well as normal ASCII a-z, etc.). How do I convert these files to UTF-8 using C# so that the single-byte accented characters in ISO 8859-1 become valid UTF-8 characters?

I have tried using a StreamReader with ASCIIEncoding, and then converting the ASCII string to UTF-8 by instantiating an ASCII encoding and a UTF-8 encoding and then using Encoding.Convert(ascii, utf8, ascii.GetBytes(asciiString)), but the accented characters are being rendered as question marks.

What step am I missing?

shareedit

You need to get the proper Encoding object. ASCII is just what its name says: it only supports 7-bit ASCII characters, and anything outside that range is decoded as a question mark, which is exactly what you are seeing. If what you want to do is convert whole files, the StreamReader/StreamWriter approach below is likely easier than dealing with the byte arrays directly.

using (System.IO.StreamReader reader = new System.IO.StreamReader(fileName,
                                       Encoding.GetEncoding("iso-8859-1")))
{
    using (System.IO.StreamWriter writer = new System.IO.StreamWriter(
                                           outFileName, Encoding.UTF8))
    {
        writer.Write(reader.ReadToEnd());
    }
}

However, if you want to have the byte arrays yourself, it’s easy enough to do with Encoding.Convert.

byte[] converted = Encoding.Convert(Encoding.GetEncoding("iso-8859-1"), 
    Encoding.UTF8, data);

It’s important to note here, however, that if you want to go down this road then you should not use an encoding-based string reader like StreamReader for your file IO. FileStream would be better suited, as it will read the actual bytes of the files.

In the interest of fully exploring the issue, something like this would work:

using (System.IO.FileStream input = new System.IO.FileStream(fileName,
                                    System.IO.FileMode.Open, 
                                    System.IO.FileAccess.Read))
{
    byte[] buffer = new byte[input.Length];

    int readLength = 0;

    while (readLength < buffer.Length) 
        readLength += input.Read(buffer, readLength, buffer.Length - readLength);

    byte[] converted = Encoding.Convert(Encoding.GetEncoding("iso-8859-1"), 
                       Encoding.UTF8, buffer);

    using (System.IO.FileStream output = new System.IO.FileStream(outFileName,
                                         System.IO.FileMode.Create, 
                                         System.IO.FileAccess.Write))
    {
        output.Write(converted, 0, converted.Length);
    }
}

In this example, the buffer variable gets filled with the actual data in the file as a byte[], so no conversion is done. Encoding.Convert specifies a source and destination encoding, then stores the converted bytes in the variable named…converted. This is then written to the output file directly.

Like I said, the first option using StreamReader and StreamWriter will be much simpler if this is all you’re doing, but the latter example should give you more of a hint as to what’s actually going on.


C# Convert string from UTF-8 to ISO-8859-1 (Latin1) H

[Origin]: http://stackoverflow.com/questions/1922199/c-sharp-convert-string-from-utf-8-to-iso-8859-1-latin1-h

I have googled on this topic and I have looked at every answer, but I still don’t get it.

Basically I need to convert UTF-8 string to ISO-8859-1 and I do it using following code:

Encoding iso = Encoding.GetEncoding("ISO-8859-1");
Encoding utf8 = Encoding.UTF8;
string msg = iso.GetString(utf8.GetBytes(Message));

My source string is

Message = "ÄäÖöÕõÜü"

But unfortunately my result string becomes

msg = "�ä�ö�õ�ü"

What am I doing wrong here?


Your code decodes the UTF-8 bytes as if they were ISO-8859-1, which is why the accented characters come out mangled. Use Encoding.Convert to adjust the byte array before attempting to decode it into your destination encoding.

Encoding iso = Encoding.GetEncoding("ISO-8859-1");
Encoding utf8 = Encoding.UTF8;
byte[] utfBytes = utf8.GetBytes(Message);
byte[] isoBytes = Encoding.Convert(utf8, iso, utfBytes);
string msg = iso.GetString(isoBytes);
The one-liner is Encoding.GetEncoding("ISO-8859-1").GetString(Encoding.Convert(Encoding.UTF8, Encoding.GetEncoding("ISO-8859-1"), Encoding.UTF8.GetBytes(myString))) – Björn Ali Göransson Dec 11 '15 at 15:35