Issue #14336 has been updated by rosenfeld (Rodrigo Rosenfeld Rosas).


Thanks, Matz, it certainly helps, but there are plenty of cases where we are not in control of how hashes are serialized and deserialized. For example, the Redis interface accepts a hash and serializes it behind the scenes using strings as keys, but you have no choice about the key type upon deserialization.
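
For example, with the redis gem (a minimal sketch; the key name is illustrative and a reachable server is assumed):

```ruby
require "redis"

redis = Redis.new
redis.hset("user:1", :name, "Alice") # a symbol key goes in...
redis.hgetall("user:1")
# => {"name"=>"Alice"}               # ...and a string key comes back out
```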

But even if you did, that wouldn't completely fix the issue, because one part of the application might use strings as keys while another part might use symbols. Unless this information were stored by Redis alongside the value, Redis couldn't know whether symbols or strings should be used when deserializing a particular stored value.
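
The JSON round trip shows the same asymmetry with nothing but the standard library:

```ruby
require "json"

h = { name: "Alice" }               # written with a symbol key
JSON.parse(JSON.generate(h))        # => {"name"=>"Alice"}
JSON.parse(JSON.generate(h))[:name] # => nil -- the key came back as a string

# The reader must already know to opt into symbol keys:
JSON.parse(JSON.generate(h), symbolize_names: true) # => {:name=>"Alice"}
```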

This sort of thing happens all the time and is not specific to JSON, or even to JSON and Redis; other examples include PostgreSQL json columns, ElasticSolr interfaces, and so on. Pretending there's an easy solution to this serialization problem, which only exists because of symbols, doesn't improve the situation. There's also the problem that forces us, Ruby developers, to constantly check the documentation or others' code just to figure out whether symbols or strings are supposed to be used as the keys of a given hash.
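
And the failure mode is always the same silent `nil`:

```ruby
config = { "timeout" => 30 } # did the library use string or symbol keys?
config[:timeout]             # => nil -- wrong key kind, no error raised
config["timeout"]            # => 30
```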

That's why I'd like to see at least some sort of out-of-the-box HWIA-like solution. Implemented as I suggested, it would be a big improvement while remaining backward compatible.
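
By HWIA-like I mean something along these lines (a minimal sketch, loosely modeled on ActiveSupport's HashWithIndifferentAccess; the class name and details are illustrative):

```ruby
class IndifferentHash
  def initialize(hash = {})
    @data = {}
    hash.each { |k, v| self[k] = v }
  end

  def [](key)
    @data[convert(key)]
  end

  def []=(key, value)
    @data[convert(key)] = value
  end

  private

  # Normalize every key to a string so lookups work with either kind.
  def convert(key)
    key.is_a?(Symbol) ? key.to_s : key
  end
end

h = IndifferentHash.new(name: "Alice")
h[:name]  # => "Alice"
h["name"] # => "Alice"
```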

----------------------------------------
Feature #14336: Create new method String#symbol? and deprecate Symbol class
https://bugs.ruby-lang.org/issues/14336#change-69679

* Author: dsferreira (Daniel Ferreira)
* Status: Rejected
* Priority: Normal
* Assignee: 
* Target version: 
----------------------------------------
From the discussions on the three previous issues related to the String vs Symbol subject ([5964](https://bugs.ruby-lang.org/issues/5964), [7792](https://bugs.ruby-lang.org/issues/7792), [14277](https://bugs.ruby-lang.org/issues/14277)), we can draw some conclusions:
* The current String vs Symbol situation is not ideal. See Matz's and Koichi's comments.
* The current philosophy is to use Symbols as identifiers and Strings when actual strings are needed.
* In practice, Symbols are used in many code bases as strings, except where the String methods are really needed.
* In practice, we design APIs to handle both String and Symbol inputs, forcing an overhead onto API development (a typical normalization pattern is sketched below).
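
A typical normalization pattern (the method name is illustrative):

```ruby
# Accept either key kind by probing both on every lookup.
def fetch_option(opts, key)
  opts.key?(key.to_sym) ? opts[key.to_sym] : opts[key.to_s]
end

fetch_option({ verbose: true }, "verbose") # => true
fetch_option({ "depth" => 3 }, :depth)     # => 3
```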

I propose deprecating the `Symbol` class and introducing `String#symbol?`.

```ruby
foo = :foo
foo.class # => String
foo.symbol? # => true
bar = "bar"
bar.class # => String
bar.symbol? # => false
```
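
For illustration, the predicate itself can be shimmed onto today's classes (an approximation only; under the actual proposal `:foo` would already *be* a String):

```ruby
class String
  def symbol?
    false # ordinary strings are not symbol-flavored
  end
end

class Symbol
  def symbol?
    true
  end
end

:foo.symbol?  # => true
"bar".symbol? # => false
```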

As a backwards-compatibility transition path, I propose:

```ruby
class Symbol
  def self.===(var)
    warn("Warning message regarding deprecated class")
    # Match both real Symbols and symbol-flavored Strings during the transition.
    var.class == Symbol || (var.class == String && var.symbol?)
  end
end

class String
  def is_a?(klass)
    # Compare classes with ==; `case klass when String` would test
    # `String === klass`, which is false for a Class object.
    if klass == String
      true
    elsif klass == Symbol
      symbol?
    else
      super # keep checks against Object, Comparable, etc. working
    end
  end
end
```
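
With these patches (and the `String#symbol?` shim above) loaded, existing type checks keep working during the transition:

```ruby
Symbol === :foo     # => true, after printing the deprecation warning
"bar".is_a?(String) # => true
"bar".is_a?(Symbol) # => false, because "bar".symbol? is false
```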



-- 
https://bugs.ruby-lang.org/
